Article

Analyzing Player Behavior in a VR Game for Children Using Gameplay Telemetry

by
Mihai-Alexandru Grosu
* and
Stelian Nicola
Department of Automation and Applied Informatics, Polytechnic University of Timisoara, 300006 Timisoara, Romania
*
Author to whom correspondence should be addressed.
Multimodal Technol. Interact. 2025, 9(9), 96; https://doi.org/10.3390/mti9090096
Submission received: 14 August 2025 / Revised: 29 August 2025 / Accepted: 5 September 2025 / Published: 9 September 2025

Abstract

Virtual reality (VR) has become increasingly popular and has started entering homes, schools, and clinics, yet evidence on how children interact during free-form, unguided play remains limited. Understanding how interaction dynamics relate to player performance is essential for designing more accessible and engaging VR experiences, especially in educational contexts. For this reason, we developed VRBloons, a child-friendly VR game about popping balloons. The game logs real-time gameplay telemetry such as total hand movement, accuracy, throw rate, and other performance related gameplay data. By analyzing several feature-engineered metrics using unsupervised clustering and non-parametric statistical validation, we aim to identify distinct behavioral patterns. The analysis revealed several associations between input preferences, movement patterns, and performance outcomes, forming clearly distinct clusters. From the performed analysis, input preference emerged as an independent dimension of play style, supporting the inclusion of redundant input mappings to accommodate diverse motor capabilities. Additionally, the results highlight the opportunities for performance-sensitive assistance systems that adapt the difficulty of the game in real time. Overall, this study demonstrates how telemetry-based profiling can shape the design decisions in VR experiences, offering a methodological framework for assessing varied interaction styles and a diverse player population.

1. Introduction

Immersive technologies, such as virtual reality (VR), augmented reality (AR), and mixed reality (MR), often referred to under the general term of extended reality (XR), have become increasingly influential across a wide range of domains, from entertainment and education to more specialized fields like healthcare and workplace training. Of all immersive technologies, VR offers the most complete sense of presence, allowing users to feel visually and physically situated inside a 3D environment [1]. With the growing adoption of standalone VR headsets such as the Meta Quest, these devices are increasingly finding their way into homes, educational settings, and healthcare settings. This has opened many new opportunities for children to engage with immersive digital content [2,3]. The trend is largely driven by the increasing affordability of all-in-one VR devices, which eliminate the need for external sensors or high-end gaming PCs and which can be used on the spot when needed, increasing their portability [4].
Interaction in VR environments is typically achieved through head-mounted displays (HMDs) and tracked hand controllers. Actions such as grabbing, pointing, or launching objects are executed using physical button presses, with the index-finger trigger and the side-mounted grip being the most common [5]. While these input schemes generally work well for adult users, they implicitly assume a certain degree of manual dexterity, hand size, and strength, which may not be present in all user groups. Previous studies have noted ergonomic and usability challenges with VR games, particularly for younger users and older adults, who may struggle to operate trigger-based inputs reliably, especially during rapid or repeated tasks [6,7]. In response, some developers have introduced redundant input mappings, allowing multiple buttons to perform the same action, to accommodate a wider range of motor abilities [8].
Despite these design considerations, empirical data on how children naturally interact with VR systems, in unstructured, play-oriented settings, remains scarce. Most existing studies are conducted in controlled or therapeutic contexts, where children are typically given instructions or training prior to the activity [6,9]. Furthermore, the available interactions are usually constrained to a single, fixed input method, predefined by researchers [2,10]. These designs tend to limit the ability to observe natural preferences or strategies in how children across various ages choose to interact with immersive environments.
To investigate this, we have developed VRBloons, a custom short-form VR game designed to gather interaction data from children during free-form play. The game takes place in a hot air balloon that roams around an open map resembling a peaceful countryside. The players throw darts to pop as many small floating balloons as possible within three-minute sessions. Both the grip and the trigger buttons were assigned to the throwing action on both hand controllers, allowing players to choose their preferred input method without being guided toward one or the other. During gameplay, the system logs extensive telemetry data, including score, darts fired, balloon hits, head and hand movement, button press counts, and other derived performance metrics.
The goal of this study is to investigate how children of varying ages interact with a VR game that supports multiple input methods and to analyze behavioral patterns observed during gameplay. By applying cluster analysis and defining a range of derived interaction metrics, we aim to identify user profiles and input preferences. We also seek to provide guidelines for designing more inclusive VR experiences that accommodate players of all ages.
From this analysis, four distinct behavioral player profiles emerged. Two of them, one dominated by trigger use and the other by grip use, achieved high scores and accuracy, showing that multiple efficient play styles can coexist when redundant mappings are available. In contrast, the mixed-input and low-tempo clusters exhibited lower accuracy and scores, reflecting more exploratory or less mature interactions. A composite Development Index further distinguishes these groups by associating higher accuracy and more economical movement with more advanced motor maturity. Together, these findings provide evidence that flexible input mappings enable diverse, age-dependent interaction styles to emerge naturally in VR.
Free play is particularly suitable in virtual reality when the goal is to observe authentic interaction. Heavily scripted tasks with clear objectives channel users into specific techniques, whereas child-directed activity lets natural preferences in controller use and movement patterns emerge [11]. By combining ecological validity with instrumented measurement, VR enables the study of authentic interaction under minimal constraints while also capturing precise telemetry [12]. For an objective focused on profiling input preference and movement economy rather than evaluating a specific learning outcome, a minimally constrained play session is therefore an appropriate design choice [11].

2. Related Works

Previous research on VR interaction (specifically with children) covers a wide range of approaches, but the findings are scattered across different domains and study designs. To provide a clearer overview, we organized the related works into five categories that reflect the main perspectives relevant to our study. The first reviews educational and therapeutic studies, where children usually engage with VR in guided or clinical contexts. The second addresses input methods and interaction design, focusing on how different mappings and interactions influence usability. The third examines ergonomics and hardware accessibility, highlighting how device design constrains participation. The fourth presents free-play and open-ended interactions, where natural preference can be observed with minimal guidance. Finally, the fifth looks at gameplay analytics and telemetry, which provide a methodological background for profiling user behavior. Organizing the papers into these categories allows us to compare existing findings across these perspectives and position our study in relation to them.

2.1. Educational and Therapeutic Contexts

Studies involving children tend to rely on guided tasks or clinical protocols. They show strong engagement and learning potential, yet their structure limits observation of spontaneous input preferences, which is the gap our free-play design targets.
Research on VR interaction shows that children tend to actively engage with immersive environments when they are free to explore and make meaningful choices. For example, Pellas et al. highlight that avatar-mediated activities and flexibility of in-game options facilitate creative reflection and cognitive engagement among children in educational 3D games. However, they also note usability challenges related to the lack of child-focused design guidelines and considerations of age-specific needs [13]. In a similar manner, Fernandes et al. report that children who interact with educational VR games show high levels of presence and enthusiasm. They also encounter challenges due to hardware limitations and input precision, particularly with devices such as the Leap Motion and Kinect [14]. For children with autism spectrum disorder (ASD), Stasolla et al. found that VR offers a safe space for practicing social skills. Moreover, flexible input options and participatory designs, such as involving caregivers, proved to be effective for boosting player accessibility and engagement [15]. Fowler et al. highlight that input methods where children control avatars with natural body movements increase immersion and enjoyment but also come with technical usability issues. They also noted that such issues might be addressed through an iterative user-centered development process [16].

2.2. Input Methods and Interaction Design

A growing number of studies have explored different input methods and button mappings (e.g., controller vs. hand tracking, fixed vs. flexible mappings). This is central to a core design implication of our study: redundant input mappings that let children express natural preferences.
When it comes to input methods and interaction designs, recent studies highlight their importance for the accessibility and usability of VR games. Dudley et al. explicitly discuss the issues standard VR controllers pose for users with limited dexterity and for younger users, highlighting the importance of alternative input methods, such as hand tracking, gaze, and voice control. Additionally, they strongly advocate for flexibility and customization of input mapping, allowing users to modify controls to their individual needs [17]. Similarly, Kamińska et al. address the benefits of intuitive and natural input methods, including hand tracking and speech, for improving user experience in VR. Additionally, they warn that poorly designed UI layouts or confusing input mappings can negatively impact usability, particularly for children [18]. Johnson et al. provide empirical evidence that controller-based interactions outperform hand tracking in terms of precision and ease of use, primarily due to superior feedback and reliability. However, they also note that users tend to prefer input methods that closely resemble real-world actions [5]. Similarly, Rainer et al. evaluated input methods in a first-aid educational game, showing that while controllers provide higher usability, precision, and ease of use than hand tracking, both methods were effective for gameplay [19]. To address technical challenges, some researchers, such as Lee and Shin, have proposed flexible input mapping systems that support a wide range of standard and non-standard VR devices [20]. Other recent work, such as that of Urech et al., directly compared mapped versus freely assignable button interactions in a VR tutorial for two age groups; the results show that flexible input options benefited adolescents, while adults performed better with more conventional mappings [21].
Overall, there is a clear trend towards more flexible, multimodal, and user-centered input designs that can improve accessibility and usability of VR for diverse user groups, including children.

2.3. Ergonomics and Accessibility

Another important consideration when studying VR interaction in children is the evidence on hardware form factors, such as button reach, required grip strength, and fatigue. These constraints can suppress performance regardless of software design, so understanding them contextualizes our telemetry findings.
While much research has been conducted on the importance of flexible input methods in VR, the ergonomics of VR input devices remain a critical challenge. Standard controllers are typically designed for adult hands, leading to difficulties for children and other users, such as the inability to comfortably reach all buttons, sustain a strong grip, or avoid strain during extended use [17]. Galindo et al. and Wentzel et al. report that users with limited strength or dexterity often experience fatigue, discomfort, or even require outside assistance to operate VR hardware. These barriers not only limit comfort but can lead to exclusion from VR activities, regardless of software interface adaptations [22,23]. Additionally, repetitive gestures or poorly placed UI elements can quickly lead to exhaustion for young users, as shown by Kamińska et al. [18]. In conclusion, the physical form and ergonomic design of VR devices play an important role in determining who can fully participate in immersive experiences.

2.4. Free-Play and Open-Ended Interaction

Prior research has also examined VR use in minimally scripted or exploratory settings, emphasizing how child-directed activity can reveal authentic interaction strategies. These works are particularly relevant to our study, as they align with our goal of observing natural input preferences without imposing a strict guided task structure. Unstructured interaction provides opportunities to observe behavior that is closer to natural play, while VR environments combine ecological validity with precise measurement capabilities [12]. Presence research characterizes how immersion, sensorimotor contingencies, and plausibility support realistic responses in VR, and how task design and available affordances modulate both presence and performance [24]. In educational scenarios, replay analysis has been used to capture divergence from intended solutions in open-ended tasks, pointing to design improvements [25]. Despite these contributions, only a small number of studies have combined exploratory conditions with redundant input mappings and systematic telemetry collection.

2.5. Gameplay Analytics and Telemetry

Finally, it is important to discuss the methods for logging, clustering, and visualizing player behavior. These studies establish the analytical toolkit we use to profile interaction styles. A growing number of researchers have demonstrated the essential role of analytics and data collection in understanding player behavior and optimizing game experiences. Foundational work by Drachen et al. has established the use of large-scale gameplay telemetry and clustering algorithms such as k-means to identify player archetypes and behavior patterns in commercial games [26,27]. Building on these approaches, more recent studies have applied telemetry and replay analysis to VR games and immersive experiences. For example, Gagnon et al. proposed a replay-based system for VR learning games that captures head, hand, and action telemetry during gameplay [28]. This data enables the analysis of player interactions and automated detection of usability issues across larger audiences. Similarly, Harpstead et al. used replay logs and clustering techniques in educational games to analyze player actions in open-ended environments [25]. This approach allows researchers to prototype new measures of learning and identify when player actions diverge from intended solutions, revealing opportunities for design improvements. Wallner and Kriglstein showed that visualizing gameplay telemetry can reveal key player behavior like decision points, engagement patterns, and usability bottlenecks, thus helping identify usability issues in serious games [29]. Collectively, these analytics-driven methods provide deep insight into player behavior and effective designs, enabling iterative improvements.

2.6. Comparative Synthesis and Positioning of Our Study

In summary, while previous research has provided valuable insights into VR interaction, input methods, and player analytics, most studies have relied on fixed input schemes, structured tasks, or controlled experiments. These constraints tend to limit the opportunity to observe natural player behavior, or to identify interaction patterns and player preferences.
Our study addresses these gaps by adopting flexible input mapping and collecting detailed telemetry during gameplay sessions with children. By analyzing data gathered during unstructured gameplay sessions (N = 151), without providing additional guidance or an imposed objective, this study aims to identify behavior patterns and input preferences of children aged 4 to 13 in VR game experiences. Unlike educational and therapeutic studies that relied on scripted tasks, our results capture spontaneous strategy in a free-play setting. Additionally, in contrast to studies limited to single input schemes, our data shows that comparable performance can be achieved with different preferred inputs. Finally, by integrating telemetry-driven analysis with an ecologically valid, unguided context, our study extends existing analytics-based approaches and provides a better understanding of how children naturally interact in VR.

3. Methodology

The following section details the design, development, and data collection process used in VRBloons. First, we present the game's setup and mechanics, focusing on the choices made to optimize usability and engagement for the target audience. Next, we outline the development tools and target hardware platform. We then explain our approach to collecting detailed gameplay telemetry, including the types of data logged during each session. Finally, we present the procedure for data analysis and visualization used to identify behavioral patterns and input preferences among players.
Figure 1 provides a high-level overview of the VRBloons system and the analytical pipeline. The process begins with the design and deployment of the VRBloons game, where children engage in short free-play sessions. During gameplay, the system automatically logs a structured set of telemetry metrics, including performance outcomes, input usage, and spatial movement. After each session, the data is serialized and saved in JSON format, and later converted to a tabular CSV file for analysis. A data preparation stage then filters out invalid sessions by eliminating outliers (e.g., very low scores, very few throws). Next, through feature engineering, a set of derived metrics is computed from the raw data for use in the analysis. These features are normalized and aggregated into a standardized matrix, which forms the basis for dimensionality reduction (PCA) and clustering (k-means) to detect distinct player profiles. Finally, the results are displayed through various tables and figures, and statistical validation and robustness checks are performed to ensure that the clusters reflect meaningful behavioral distinctions.
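The dimensionality-reduction and clustering stages of this pipeline can be sketched as follows. This is a minimal illustration with scikit-learn on synthetic stand-in data, not the actual analysis code; the real feature matrix and its construction are described later in this section.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

# Synthetic stand-in for the standardized feature matrix
# (151 sessions x 8 features, as in the real dataset).
rng = np.random.default_rng(0)
X = rng.normal(size=(151, 8))

# Dimensionality reduction (PCA), then k-means clustering
# to partition sessions into candidate player profiles.
X_2d = PCA(n_components=2).fit_transform(X)
labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X_2d)
```

The number of components and clusters here are placeholders; the paper's robustness checks over alternative values are discussed in Section 4.3.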

3.1. Game Setup and Design

VRBloons was designed to provide a safe, easy-to-understand, and enjoyable experience for children of all ages. The game takes place in a bright, colorful low-poly countryside environment, selected for its visual simplicity and broad appeal to younger players (see Figure 2). To better understand how the game mechanics work, a gameplay video is available on GitHub (git version 2.47.1) [30]. The scene is based on a free asset available on the Unity Asset Store, while the rest of the 3D models were taken from Sketchfab. The players are placed inside a hot air balloon basket that follows a predetermined trajectory defined using spline curves. The balloon's path forms a closed loop, and it moves at a constant speed, taking exactly three minutes to complete one circuit. This design ensures a consistent experience across all players, allowing us to observe and compare the results, while also minimizing the risk of motion sickness, which is a known concern for first-time VR users [31].
The gameplay is divided into three stages:
  • First, the balloon begins ascending, and a ten-second countdown is displayed to the player.
  • The second stage represents the main gameplay phase and consists of a three-minute flight during which smaller balloons spawn around the player and can be popped by throwing darts at them.
  • Lastly, the game ends once the balloon reaches the end of the spline and starts descending back to the beginning, while also displaying the final score stats.
During the main gameplay stage, players throw darts at floating balloons that randomly spawn in front of the hot air balloon’s path. Balloon spawning is governed by a weighted random number generator, which selects balloon types based on predefined probability weights. The control scheme was designed to be highly accessible, allowing players to throw darts using either the trigger or grip buttons on both left and right controllers, individually or at the same time. This flexibility allows players to discover and use whichever hand, finger, or button combination feels most comfortable and natural to them. Throwing requires simply aiming the controller in the desired direction and pressing a button, making it accessible to children of all ages. Darts use realistic physics, arcing and dropping due to gravity, so hitting far-off balloons may require players to adjust their aim trajectory, encouraging experimentation. Balloons are color-coded by rarity, with rare balloons appearing less frequently but granting higher point values. The full list of balloon types, corresponding score values, and spawn weights is presented in Table 1.
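The weighted balloon spawning described above can be sketched with Python's standard library. The type names and weights below are placeholders for illustration; the actual values appear in Table 1.

```python
import random

# Hypothetical balloon types and spawn weights (see Table 1 for the real values).
balloon_types = ["common", "uncommon", "rare"]
spawn_weights = [70, 25, 5]

def spawn_balloon(rng=random):
    """Pick the next balloon type with probability proportional to its weight."""
    return rng.choices(balloon_types, weights=spawn_weights, k=1)[0]
```

Rarer types are drawn less often, so granting them higher point values keeps the expected reward roughly balanced while encouraging players to aim for them.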
To further enhance accessibility and reduce cognitive load, the user interface is intentionally minimalist, with no interactive menus or buttons. Only essential visual cues such as the countdown, pop-up score indicators, or the end-game scoreboard are displayed automatically in order to create a straightforward experience for the younger children. These choices are in line with other HCI (human–computer interaction) studies which show that simple, consistent, and uncluttered interfaces facilitate better engagement and usability for young audiences [32]. Lastly, for better immersion, elements such as particle effects or sounds were included for certain actions like balloon popping or throwing the darts. Together, these increase player satisfaction and reinforce successful actions, as recommended by other studies about meaningful feedback in VR games [33].
We selected a simple task of throwing darts with immediate, visible outcomes to minimize learning time for younger players and to engage visuomotor coordination. This is consistent with embodied, feedback-rich interaction principles in educational VR [11]. We also chose simple mechanics that can be readily understood and played even by children with no prior VR experience, ensuring accessibility across a wide player range. The fixed three-minute session was chosen to standardize exposure for fair comparison. In addition, the short duration reflected the conditions of the public fair, where there was a high influx of players. Limiting each game to three minutes ensured that everyone who wished to experience immersive VR had the opportunity to participate. Additionally, we wanted to limit fatigue and cybersickness risk, as exposure duration and task demands are known determinants of cybersickness [34,35]. Mapping the same action to both trigger and grip buttons was intentional, as it allows for input preference without instruction bias. This is in line with findings that controller techniques systematically affect speed-accuracy trade-offs in VR [36]. Additionally, the hot-air balloon followed a fixed on-rails path to reduce exposure to continuous steering-based optic flow, which is associated on average with higher cybersickness than teleportation, while still allowing head and hand movement logging [37]. This supports our goal of profiling input preference and movement economy, rather than evaluating a specific educational outcome.

3.2. Development and Target Platform

The VRBloons game was developed using the Unity engine version 2022.3 LTS, chosen for cross-platform support and its robust XR development toolkit. Unity’s XR Interaction Toolkit was used to handle input management and device tracking, while the game logic and mechanics were written using custom C# scripts. All development was conducted on a Windows 10 system, using Rider as the IDE of choice, with builds deployed for the Meta Quest 2.
We decided to target the Meta Quest 2 as our primary device, due to several factors. First, this device is one of the most widespread VR headsets, while also being much more affordable. Although not as powerful as PC-based headsets, the Meta Quest 2 features a fully integrated tracking system, allowing for both controller-based and hand-based interaction. Additionally, its all-in-one capabilities make it an ideal choice for deployment in schools or other public spaces, as it does not require a complex setup with an external PC and additional sensors.
Since the Meta Quest 2 operates with a relatively weak mobile-grade GPU [38], several optimizations had to be made to maintain a consistent frame rate of 72 FPS, which is critical for reducing motion sickness. First, all chosen models were low-poly, not only for their aesthetic but also to keep the total polygon count between 50 k and 100 k. To further optimize rendering performance, Level of Detail (LOD) groups were applied to all environmental props, allowing distant objects to be culled automatically. Additionally, all shadows were disabled, and only one directional light was used to light the scene. To minimize memory usage and avoid filling the scene with objects, all balloons were destroyed after a few seconds or after being popped.
In order to simplify testing and deployment, a custom developer shortcut was introduced via a hidden button combination (holding A and B buttons simultaneously for five seconds). This combination resets the scene, positions the camera back to its initial state, and saves the data of the last session to a JSON file, before clearing it for the next session. This ensured that all data was properly saved, and that the device would be ready for the next gameplay session.

3.3. Data Collection

To facilitate the analysis of player behavior and interaction preferences, VRBloons was instrumented to collect detailed telemetry throughout each play session. The data collection system was integrated directly into the game’s logic, using a combination of event-based logging and frame-by-frame motion sampling. All relevant gameplay data is stored locally in a structured JSON format and saved automatically at the end of each session. An example of the resulting JSON data structure is shown in Figure 3.
Each session begins by generating a unique playerID in the form of a GUID (Globally Unique Identifier). This identifier allows individual sessions to be tracked without linking them to any personal information, ensuring that the participants stay anonymous. A timestamp recorded at the beginning of each session captures the day, month, and hour at which it took place. Throughout gameplay, various interaction metrics were recorded in real time. We record key gameplay outcomes, such as the player's total score, number of darts fired, balloons popped, and accuracy (balloons popped per darts fired), which give a rough estimate of the player's overall performance. Additionally, we tracked input usage, including the number of trigger and grip button presses, to see how often each button was used.
To measure physical engagement and body movement, we also collected spatial telemetry, including HMD and both hand controllers. This includes the total movement of the headset (headMovement), the total distance traveled by the left and right controllers (leftControllerMovement and rightControllerMovement), and the total rotational ranges in yaw and pitch (lookAroundYawRange, lookAroundPitchRange). These metrics provide insight into how physically active and exploratory the player was within the virtual environment.
All collected telemetry variables are listed in Table 2, which summarizes the raw data fields gathered from each individual gameplay session.
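A minimal sketch of such a per-session record follows. The movement and rotation fields use the names given in the text; the remaining field names are assumptions standing in for the exact schema shown in Figure 3 and Table 2.

```python
import json
import uuid
from datetime import datetime

def new_session_record():
    """Create an empty telemetry record, as serialized to JSON at session end.
    Counter field names are illustrative; see Figure 3/Table 2 for the real schema."""
    return {
        "playerID": str(uuid.uuid4()),           # anonymous per-session GUID
        "timestamp": datetime.now().isoformat(),  # day, month, and hour of the session
        "score": 0,
        "dartsFired": 0,
        "balloonsPopped": 0,
        "triggerPresses": 0,
        "gripPresses": 0,
        "headMovement": 0.0,
        "leftControllerMovement": 0.0,
        "rightControllerMovement": 0.0,
        "lookAroundYawRange": 0.0,
        "lookAroundPitchRange": 0.0,
    }

record = new_session_record()
serialized = json.dumps(record, indent=2)  # written to local storage in the game
```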
Data collection took place at a public children's fair hosted by the university. Four Meta Quest 2 headsets were installed at a dedicated booth outside the main faculty building and were operated by volunteers. Each child played standing within a marked play area, under supervision for safety. No personal or demographic data (such as age or gender) were recorded, ensuring complete anonymity. At the start of each session, participants were told that both the trigger and the grip button could be used to throw darts, but no further guidance was given, allowing them to choose the input method they preferred. Although the public fair setting introduced background noise and occasional distractions, it also reflected conditions closer to everyday use, making the captured behavior more natural than it would be in a controlled laboratory.

3.4. Data Preparation and Feature Engineering

The first step in preparing the collected data for analysis was applying a filter to remove incomplete or invalid sessions. We removed from the dataset all sessions with a total score below 50 or fewer than 10 darts thrown, as these were likely the result of quick resets or incomplete sessions during transitions between participants. Out of the initial 172 recorded sessions, 151 remained after applying the filters and were used for further analysis. The valid sessions were then exported from the JSON format to a tabular CSV (comma-separated values) structure. This dataset was loaded into a Python-based analysis environment (Python 3.10.7), using Jupyter Notebook (ipykernel 6.30.0), and managed under Git version control for full transparency and reproducibility [30].
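The filtering step can be expressed in pandas as follows. The rows are toy values for illustration, and the column names are assumptions based on the telemetry described earlier.

```python
import pandas as pd

# Toy stand-in for the exported session table (the real data has 172 rows).
sessions = pd.DataFrame({
    "score": [120, 30, 200, 45],
    "dartsFired": [25, 8, 40, 12],
})

# Keep only sessions meeting the validity thresholds described above:
# total score of at least 50 AND at least 10 darts thrown.
valid = sessions[(sessions["score"] >= 50) & (sessions["dartsFired"] >= 10)]
```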
For a more meaningful comparison and clustering across play sessions, we constructed a feature matrix consisting of eight metrics. These variables were selected to capture core aspects of gameplay such as interaction styles, input preferences, performance, motor activity, and visual scanning. Of these, four values (accuracy, headMovement, lookAroundYawRange, and lookAroundPitchRange) are metrics collected directly from gameplay. The remaining four (grip_ratio, score_per_throw, throws_per_s, and hand_movement) were created through feature engineering to express ratios and rates that normalized raw input counts. During the game sessions we observed that most participants were right-handed and that the left controller was minimally used. Therefore, we decided to only include the data from the right controller when computing the hand_movement metric. Table 3 contains a summary of all eight metrics used in the analysis.
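The four engineered features might be computed as in this sketch; the raw column names and toy values are assumptions, while the fixed 180-second gameplay stage supplies the denominator for the throw rate.

```python
import pandas as pd

# Toy session rows; raw column names are assumptions based on Table 2.
df = pd.DataFrame({
    "score": [300, 150],
    "dartsFired": [60, 40],
    "gripPresses": [40, 5],
    "triggerPresses": [20, 35],
    "rightControllerMovement": [52.0, 31.0],
})

SESSION_SECONDS = 180.0  # the fixed three-minute gameplay stage

# Ratios and rates that normalize the raw counts across sessions.
df["grip_ratio"] = df["gripPresses"] / (df["gripPresses"] + df["triggerPresses"])
df["score_per_throw"] = df["score"] / df["dartsFired"]
df["throws_per_s"] = df["dartsFired"] / SESSION_SECONDS
df["hand_movement"] = df["rightControllerMovement"]  # right controller only, as noted above
```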
To prepare the dataset for distance-based modeling, all features were first winsorized at the 1st and 99th percentiles to limit the influence of outliers. Then we standardized each metric using z-score normalization (mean = 0, standard deviation = 1), ensuring equal weighting of features during clustering and dimensionality reduction. These transformations produced a numeric-only standardized feature matrix that served as the input for dimensionality reduction and unsupervised learning as described in the following subsection.
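A minimal sketch of the winsorize-then-standardize step, using a random matrix in place of the real feature matrix:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(151, 8))  # stands in for the real feature matrix
X[0, 0] = 50.0                 # inject an extreme outlier

# Winsorize each column at the 1st/99th percentiles, then z-score it.
lo, hi = np.percentile(X, [1, 99], axis=0)
Xw = np.clip(X, lo, hi)
Xz = (Xw - Xw.mean(axis=0)) / Xw.std(axis=0)
```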
The total number of sessions analyzed was N = 151. While modest, this sample provides a favorable ratio of observations to features relative to the five-feature matrix used for clustering, making it adequate for exploratory unsupervised analysis. Our main objective was profiling, that is, testing whether distinct interaction patterns exist, rather than estimating population prevalence. To reduce overinterpretation, we (i) used internal validation (e.g., silhouette) to verify that the partition was not the result of noise, (ii) reported non-parametric contrasts across clusters for each feature, and (iii) examined sensitivity to alternative k values (see Section 4.3). The findings are intended as design-relevant behavioral profiles that can transfer to similar free-play deployments. Confirming prevalence rates and developmental trends will require larger, controlled samples in future work.

3.5. Development Index

Since we did not collect any age-related data, we computed a score to summarize signals of control maturity in a single dimension, which would act as a proxy for age. The Development Index (DI) combines three standardized components (z-scored):
  • Hand and head movement: greater travel reflects less economical motion control, which is common in younger children, while more mature motor control is associated with shorter and straighter paths [39].
  • Accuracy: as noted by other studies, accuracy in aimed actions improves with age and skill, reflecting better speed-accuracy trade-off regulation [40,41].
  • Grip use: in VR controllers, the grip button involves a whole-hand activation that aligns with a power-grip style, while the trigger resembles a precision finger action. While not a maturity marker by itself, we include grip use as a control-style indicator that may co-occur with accuracy and economy differences [42].
Each feature was transformed to a z-score. The index was then computed as:
DI = z(γ) + z(θ) + z(μ) − z(α)
where γ represents grip ratio, θ is the total hand movement, μ is the total head movement, and α represents the final accuracy of the session.
Larger values indicate less motor maturity (greater reliance on the grip input style, more hand travel, lower accuracy). If desired, multiplying by −1 yields an intuitive “Maturity Index” without changing any inference. DI is intended as a heuristic proxy, not a diagnostic measure or a substitute for age. We assessed internal validity by verifying that DI was negatively associated with accuracy and positively associated with hand movement, and by comparing DI across clusters (see Section 4.2 and Section 4.3).
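A small sketch of the DI computation on made-up session values (the numbers are purely illustrative):

```python
import numpy as np

def zscore(x):
    x = np.asarray(x, dtype=float)
    return (x - x.mean()) / x.std()

# Purely illustrative per-session values.
grip_ratio    = np.array([0.10, 0.90, 0.50, 0.70])
hand_movement = np.array([40.0, 80.0, 60.0, 90.0])   # controller travel (m)
head_movement = np.array([10.0, 25.0, 15.0, 30.0])   # headset travel (m)
accuracy      = np.array([0.55, 0.30, 0.45, 0.25])

# DI = z(grip) + z(hand) + z(head) - z(accuracy); larger = less motor maturity.
DI = zscore(grip_ratio) + zscore(hand_movement) + zscore(head_movement) - zscore(accuracy)
```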

3.6. Dimensionality Reduction and Clustering

To assess potential collinearity and to visualize the overall structure, we conducted a principal component analysis (PCA) on the standardized feature matrix. The resulting scree plot (Figure 4) shows that the first four principal components represent almost 80% of the total variance. This indicates that players’ behavior is best characterized along multiple axes, supporting our use of a multi-feature clustering approach. Additionally, we computed pairwise correlations, which revealed moderate to strong relationships only between certain features (e.g., accuracy with score per throw), while most variables remained largely independent; head movement, however, correlated strongly with hand movement (ρ = 0.67).
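The scree inspection can be sketched with scikit-learn’s PCA; a random matrix stands in for the standardized feature matrix here, so the variance fractions will not match the reported ~80%:

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
Xz = rng.normal(size=(151, 8))  # stands in for the standardized feature matrix

pca = PCA().fit(Xz)
cumvar = np.cumsum(pca.explained_variance_ratio_)
# cumvar[3] is the fraction of variance captured by the first four components
```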
To improve cluster cohesion and retain the main focus of this study, a five-feature set was selected. Table 4 presents the final feature set used for clustering:
This subset raised the mean silhouette from 0.20 (all eight metrics, k = 4) to 0.25 and removed the micro-cluster formed by the yaw-range feature, leaving four relatively balanced groups. While a four-feature matrix raised the silhouette to 0.289, we ultimately retained the five-feature set, as it also contained input preference and resulted in more balanced clusters. Diagnostics for the discarded variables and for alternative feature sets (six-feature, eight-feature, PC) are provided in an accompanying exploratory notebook and produce the same performance hierarchy [30].
Clustering was performed using both k-means and Gaussian Mixture algorithms with the scikit-learn implementation [43]. To avoid arbitrary selection of the cluster number (k), we computed silhouette coefficients for k ranging from 2 to 6. While the silhouette values were relatively flat (0.21 to 0.25) for the five-feature matrix, k-means with k = 4 achieved the highest silhouette (0.249) and was retained as the final solution.
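The silhouette sweep over k = 2…6 can be sketched as follows, using synthetic well-separated blobs in place of the real five-feature matrix:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

rng = np.random.default_rng(0)
# Four well-separated synthetic blobs in five dimensions.
X = np.vstack([rng.normal(loc=c, scale=0.5, size=(38, 5)) for c in (0, 3, 6, 9)])

scores = {}
for k in range(2, 7):
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)
    scores[k] = silhouette_score(X, labels)
best_k = max(scores, key=scores.get)  # highest mean silhouette wins
```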
For each metric, we first applied a non-parametric Kruskal–Wallis test to assess whether at least one cluster differed significantly from the others (using α = 0.05). When the test results were significant, we also conducted post hoc pairwise comparisons between clusters using Dunn’s test with Holm correction. To evaluate the robustness of the clusters, we performed two complementary stability checks. First, random multiplicative noise (±5%) was added to the standardized feature matrix and the k-means clustering was repeated 10 times, calculating the Adjusted Rand Index (ARI) between the original and perturbed cluster labels. The mean and minimum ARI values quantify how consistent the cluster assignments remain after small data perturbations. Second, we repeated the clustering analysis at k = 3 and k = 5, calculating silhouettes for each solution to assess the sensitivity of the results to the choice of cluster number.
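A condensed sketch of these validation steps (Kruskal–Wallis per feature, plus the ±5% perturbation/ARI stability check) on synthetic data; Dunn’s post hoc test with Holm correction is available in the third-party scikit-posthocs package (`posthoc_dunn`) and is omitted here to keep the dependencies minimal:

```python
import numpy as np
from scipy.stats import kruskal
from sklearn.cluster import KMeans
from sklearn.metrics import adjusted_rand_score

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(loc=c, scale=0.6, size=(38, 5)) for c in (0, 3, 6, 9)])
labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X)

# Kruskal-Wallis on one feature across the four clusters (alpha = 0.05).
groups = [X[labels == c, 0] for c in range(4)]
H, p = kruskal(*groups)

# Stability: re-cluster after +/-5% multiplicative noise, compare with ARI.
aris = []
for _ in range(10):
    noise = 1 + rng.uniform(-0.05, 0.05, size=X.shape)
    perturbed = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X * noise)
    aris.append(adjusted_rand_score(labels, perturbed))
```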

4. Results

The following section presents a detailed analysis of player behavior and performance in our dataset. It begins with descriptive statistics and visualizations of the core metrics derived from the raw data; the goal of this preliminary analysis is to highlight key behavioral trends and relationships across all play sessions before any clustering is applied. We then proceed with the identification and interpretation of the resulting clusters. Lastly, statistical validation and robustness checks for the discovered clusters are reported in the final subsection.

4.1. Feature Selection

A total of 151 valid play sessions were retained for analysis after applying the filters described in the previous section. Players varied substantially in their behavior and performance, as can be seen from the wide ranges and distributions for each variable shown in Table 5. The median total score was 2160.00 (interquartile range [IQR]: 962.50–3520.00), with scores spanning a wide range, indicating diverse skill levels among the players. The median firing rate was 0.94 throws per second, with most players situated near the median. Measures of physical engagement varied as well. Hand movement was predominantly right-handed, with a median of 59.71 m (IQR: 41.81–80.74), although individual values varied far more widely. Head rotation, particularly in pitch, also spanned a broad range, with most sessions involving substantial whole-body movement.
Pairwise relationships between key gameplay metrics are summarized in the feature-to-feature correlation matrix (Figure 5). We observed several strong links such as accuracy being highly correlated with score per throw (r = 0.76). Additionally, hand movement was closely related to head movement (r = 0.67), indicating that more active players tend to move both the headset and the controllers together.
However, input preference (grip_ratio) showed only a weak correlation with accuracy (r = −0.27), indicating that both high and low accuracy were achieved with either control scheme. This pattern can be observed in Figure 6, which plots accuracy against grip ratio for all sessions. While most participants showed a preference for either the trigger or the grip button, both high- and low-accuracy performances were observed for both input styles. The weak negative correlation suggests that the redundant input mapping allowed effective performance regardless of which button was favored. Additionally, players with hybrid input tended to achieve lower accuracy than those with a strong preference, suggesting a more exploratory play style.

4.2. Cluster Analysis and Identifying Behavioral Profiles

To examine the presence of hidden behavioral profiles, we applied k-means clustering (k = 4) to the standardized five-feature matrix described earlier. The optimal number of clusters was determined by maximizing the silhouette coefficient (mean silhouette = 0.254). Additionally, we excluded metrics such as yaw_range from the clustering because they produced a very small cluster driven by a few extreme edge cases.
Cluster sizes were reasonably balanced (n = 22–46 per cluster). To visualize cluster separation, we projected the data onto the first two principal components (PCs), which together explained 62.6% of the total variance. Figure 7 displays the resulting clusters in a PC1-PC2 space, revealing some distinct, although slightly overlapping clusters. More clearly distinct clusters were observed in a three-dimensional PCA plot (Figure 8a). For a non-linear perspective, we also visualized the data using t-distributed stochastic neighbor embedding (Figure 8b), which further supports the presence of four distinct groupings, with a clearer separation.
A full summary of cluster centroids across k = 3, 4, and 5 is provided in Table 6. The k = 4 solution, retained as our main model, shows two high-performance clusters. C0, the largest cluster (n = 46), represents players who used the trigger almost exclusively as their input of choice (grip_ratio = 0.09 ± 0.09). These players achieved a high median score (3028) as well as the highest accuracy (0.54 ± 0.15), with minimal unnecessary movement. Similarly, C2, the efficient grip cluster (n = 40), also shows high accuracy (0.50 ± 0.10) and, surprisingly, an even higher median score (3552), with a near-exclusive preference for the grip button (grip_ratio = 0.94 ± 0.12). In contrast, C1 presents a mixed input pattern (0.69 ± 0.32), characterized by the lowest accuracy (0.27 ± 0.09), low efficiency, a low median score (970), and a below-average throwing rate, despite being the most physically active (82.27 ± 25.51 m hand movement). The smallest cluster, C3 (n = 22), shows moderate grip use, low accuracy (0.37 ± 0.14), low efficiency, as well as the slowest tempo and lowest hand movement (29.86 ± 14.64 m), leading to the lowest median score (602). Together, these profiles illustrate a diversity of player strategies among participants and confirm that high performance was achievable using either trigger or grip input. Conversely, less consistent input preference and more exploratory play were associated with lower efficiency and scores. For comparison, the alternative solutions (k = 3 and k = 5) are also reported in Table 6, but they either merged the two low-performance groups or fragmented them without introducing a qualitatively new profile.
To explore whether the discovered behavioral profiles were associated with developmental trends, we compared the Development Index (DI) introduced earlier across the clusters (Figure 9). The index increases with heavy grip use, high controller travel, and low accuracy; larger values therefore denote less motor maturity and likely younger players. The figure shows that C0 and C3 both exhibited the lowest median DI, suggesting that these clusters contain the more developmentally mature players according to our proxy. However, while players in C0 displayed the highest accuracy and very good scores, players in C3 showed low accuracy and the lowest scores. Cluster C2, which had an intermediate median DI, suggesting moderately advanced development, also achieved very good scores and accuracy. Lastly, cluster C1 showed the highest median DI, corresponding to the least developmentally mature group. These players also scored poorly compared to clusters C0 and C2, showing a more exploratory play style rather than a focus on high performance. These findings indicate that while higher developmental maturity appears necessary for some high-performing play (such as in C0), it does not guarantee high performance. Conversely, lower maturity is consistently linked to poorer performance. Even among more mature players, differences in how they approach the game strongly influence how well they actually perform.

4.3. Statistical Validation and Cluster Robustness

To confirm that the identified clusters reflect meaningful differences in player behavior and performance, we conducted a non-parametric statistical comparison across the clusters for each core metric. Kruskal–Wallis tests indicate significant heterogeneity across the four clusters for every core metric (Table 7). Score showed the strongest effect (H = 103.9, p < 0.001), followed by grip ratio (H = 98.9) and accuracy (H = 79.6). This indicates substantial differences in performance, input preference, and precision across all clusters.
A pairwise Dunn post hoc test revealed three robust patterns. Both high-scoring clusters (C0 and C2) significantly outperformed C1 and C3 on score, accuracy, score per throw, and fire rate. The two top-performing clusters (C0 and C2) were statistically indistinguishable in score and accuracy (p > 0.30) but differed greatly on grip ratio (p < 0.001). This confirms that distinct input preferences can achieve comparable performance. Hand movement was highest in C1 and lowest in C3, mirroring the trend observed in the Development Index.
Inferential tests were conducted at the session level because the public fair deployment was anonymous, and participant identifiers were not recorded. Therefore, possible repeated sessions could not be modeled. Under these constraints, mixed-effects or multilevel models were not applicable.
Further robustness and sensitivity checks confirmed the stability of our clustering results. Adding ±5% multiplicative noise to the five-feature matrix and re-running the clustering 10 times yielded a mean Adjusted Rand Index (ARI) of 0.71 (minimum 0.55), indicating that the majority of session labels remained stable under modest data perturbations. Clustering at alternative values of k (k = 3, 5, and 6) resulted in lower silhouette scores (0.239, 0.229, and 0.224) and reduced cluster stability (ARI: 0.62 for k = 3, 0.65 for k = 5, and 0.56 for k = 6). In all cases, the performance ranking stayed constant, with C0 and C2 performing roughly equally in terms of score and accuracy, C1 scoring much lower, and C3 lower still.
These results confirm that the four-cluster, five-feature solution is both statistically distinct and robust to minor fluctuations in the input data or model configuration. Most importantly, players’ preference for one input method over the other (trigger vs. grip) remains a core feature of the resulting clusters. In addition, the Development Index showed the expected component relationships, decreasing with accuracy and increasing with hand movement, further supporting its internal validity.

5. Discussions and Conclusions

Our analysis identified four distinct interaction profiles among children who engaged in short, free-play VR sessions, derived from five engineered feature metrics. The clusters were balanced in size and showed significant differences across all performance and interaction metrics. The two top-performing clusters, C0 and C2, consistently achieved similarly high scores and accuracy despite using different input schemes. This confirms that both trigger- and grip-focused play styles can be equally effective when multiple redundant inputs are available.
The observed profiles map onto clear behavioral patterns. C0 players used the trigger almost exclusively, showed high precision, and exhibited little unnecessary movement, while C2 players displayed grip-dominant input, a faster tempo, and slightly lower precision. These profile differences are consistent with evidence that mapping the same action to different controller buttons (trigger vs. grip) shifts accuracy, error rates, tempo, and user preference, with the trigger rated as clearer and more accurate and the grab as more natural [21]. In contrast, C1 showed mixed input usage with the lowest accuracy and efficiency despite being the most physically active, whereas C3 displayed the lowest physical activity alongside low precision and tempo. This aligns well with economy-of-motion metrics, where shorter paths and smoother motions often track higher skill [44]. Notably, C3 displayed the lowest performance despite also having the lowest movement, indicating that low movement alone is no guarantee of better performance. Finally, the Development Index combining grip use, hand and head movement, and accuracy indicates more mature control for C0 and C3 and less mature control in C1 and, to a lesser extent, C2. This is consistent with Fitts-based throughput and error differences across age groups [45].
The number and composition of profiles vary with the feature set and the choice of k. Therefore, we also inspected k = 3 and k = 5 (Table 6). These settings changed granularity but preserved two invariants: the input-preference axis (trigger vs. grip dominant) and a movement-tempo-accuracy gradient. Dropping orientation-range variables increased cohesion without altering these invariants. This means that the five-feature set preserved the same qualitative patterns seen in the eight-metric space. The four-cluster solution separated two high-performing styles and two lower-efficiency styles, whereas k = 3 merged the two low-performers, and k = 5 fragmented the groups even more without a new distinct pattern.
From a design perspective, our findings support the use of redundant input mappings to accommodate a wide range of motor capabilities. Guided by ability-based design, controls should adapt to users’ abilities rather than forcing a single mapping [46]. The clear differences we observed in tempo, accuracy, and movement are a strong argument for adaptive support. It is important to note that such support methods do not depend on clustering but on real-time performance signals (e.g., success rate or completion time), which are standard inputs for dynamic difficulty adjustment (DDA) [47]. For example, lightweight assists like aim stabilization or hitbox scaling are consistent with telemetry-driven parameter scaling as demonstrated in DDA studies [48].
From a methodological standpoint, our analysis combined multiple validation steps to ensure that the clusters were both robust and interpretable. The four-cluster solution remained stable under data perturbations. Additionally, retaining the input-preference feature while excluding orientation metrics improved cohesion without sacrificing interpretability, which is exactly the role of targeted feature selection in clustering [49]. Finally, non-parametric tests (Kruskal–Wallis with Dunn post hoc) confirmed that differences between clusters were statistically meaningful, not just model artifacts [50,51].
Although VRBloons was not a curricular task, the interaction profiles describe control-layer behavior that is shared with many learning activities in VR. These patterns transfer to education because precision, tempo, and controller habits shape how students manipulate objects, attend to feedback, and sustain engagement in instructional scenarios [52]. In educational VR settings, the four profiles point to differentiated strategies. Learners resembling C0 (trigger-dominant, high precision, low movement) could be challenged with advanced accuracy tasks and multi-step manipulations, requiring little additional support; this is consistent with the expertise reversal effect (i.e., less guidance is beneficial for more proficient learners) [53]. Those similar to C2 (grip-dominant, fast tempo, slightly lower precision) may benefit from brief onboarding activities and gradually increasing accuracy demands, aligning with findings on the value of pre-training and attentional guidance in VR learning [54]. Learners in C1 (mixed input, low accuracy/efficiency) may require guided play [55], with explicit prompts to commit to one input scheme; additionally, short goal-oriented tasks and obvious cues might be used to reduce uncertainty [56]. Finally, learners similar to C3 (low activity, low precision/tempo) may need pacing support and reduced motor difficulty (e.g., enlarged targets), followed by gradual tightening of the constraints. These moves align with guided-play evidence and the motor-control prediction that larger targets lower task difficulty, which is consistent with Fitts’ Law research in HCI [57]. Although we did not model frustration or disengagement, the task structure and prior DDA practice suggest practical real-time signals for adaptation. Simple real-time telemetry, such as hit rate, streaks of misses, idle time, tempo, input-switch events, and excess hand travel, aligns with performance-based adjustment heuristics [58].
For example, if repeated misses occur in a short span, the system could temporarily enable trajectory prediction for each throw, providing corrective feedback without interrupting free-play.
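Such a miss-streak trigger could be sketched as below; the window size and threshold are hypothetical, not values used in VRBloons:

```python
from collections import deque

class AssistTrigger:
    """Flag when an assist (e.g., a trajectory preview) should switch on.

    The window size and miss threshold are hypothetical illustration values.
    """

    def __init__(self, window: int = 6, miss_threshold: int = 5):
        self.recent = deque(maxlen=window)   # sliding window of hit/miss flags
        self.miss_threshold = miss_threshold

    def record_throw(self, hit: bool) -> bool:
        self.recent.append(hit)
        misses = sum(1 for h in self.recent if not h)
        return misses >= self.miss_threshold  # True -> enable the assist
```

A throw loop would call `record_throw` after each dart and enable the assist while it returns True, letting the assist fade out naturally once hits resume.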
The analysis pipeline can be readily replicated using the telemetry schema and scripts provided (see Data Availability). Although the VRBloons game itself is not distributed, any comparable VR task with redundant mappings and the same logged features (accuracy, throws, hand movement, etc.) would yield equivalent inputs for clustering. Replication under similar public play conditions is feasible, but task complexity, control mappings, and session duration could be adjusted in future studies to test the generality of the framework across different educational or entertainment contexts. For example, a VR height-exposure task could use the same schema to log gaze toward the edge, step-back events, and head/hand movement, enabling clustering of interaction profiles that reflect different coping strategies under stress.
Nevertheless, we must acknowledge several limitations of our study. We generated a unique session ID but not a player ID, so repeated sessions by the same child may exist. Although the likelihood of repeat participation during the event was quite low due to the relatively short time and the high influx of players, this possibility weakens the strict independence assumptions of our non-parametric tests [50]. Although telemetry from both controllers was logged, the analysis focused on the right hand due to the very small number of left-handed participants. This introduces a right-hand bias that limits generality, as hand dominance can change kinematics and performance in VR [59]. Moreover, the absence of temporal logs and posture measures constrains deeper analyses of play dynamics and ergonomics, which are known to influence interaction metrics [60]. Finally, the public fair setting introduced uncontrolled noise and distractions, which may have affected performance but also reflect conditions closer to everyday play contexts. Where participant identifiers are available, mixed-effects or multilevel models would better accommodate potential repeated sessions and nesting (e.g., by headset or event day); the session-level non-parametric tests used in this study cannot represent that structure.
In addition, demographic factors were not collected due to the anonymous nature of the venue. Age, prior VR experience, and gender were therefore unavailable, which constrains interpretation of developmental or demographic effects. The Development Index and clusters should thus be understood as signals of interaction maturity within a mixed-age cohort rather than stratified developmental outcomes. These boundaries do not undermine the existence of the observed profiles but do limit how far they can be generalized to specific groups. Findings should therefore be regarded as exploratory evidence of interaction profiles under free-play VR. Additional controlled laboratory studies with demographic data will be needed to confirm the generality of these results.
Based on these findings, future studies could explore predictive models to anticipate player performance and adapt difficulty in real time. Early-session features (e.g., from the first 30–45 s) could be used to predict top-quartile scores, input-switch likelihood, or even cluster membership. Approaches such as regularized regression could then provide interpretable models, with performance evaluated through ROC-AUC or PR-AUC and probability calibration [61,62,63]. These methods could inform in-game DDA systems that provide assistance by activating features such as aim stabilization or dynamic target scaling.
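A minimal sketch of such a predictor on synthetic early-session features, assuming a binary top-quartile-score label; the feature set and signal strength are invented for illustration:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# Synthetic early-session features (e.g., tempo, early accuracy, hand travel)
# and a binary "top-quartile score" label driven mostly by the second feature.
n = 400
X = rng.normal(size=(n, 3))
y = (X[:, 1] + 0.5 * rng.normal(size=n) > 0.7).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0, stratify=y)
clf = LogisticRegression().fit(X_tr, y_tr)
auc = roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1])
```

The logistic coefficients stay directly interpretable, which is the appeal of regularized regression over black-box alternatives in this setting.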
Another complementary research direction is the use of sequential modeling to detect behavioral transitions during gameplay and estimate when performance is likely to decline. Potential approaches include hidden Markov models alongside time-to-event formulations for performance drop-off [62,64]. Contextual bandit algorithms could be used to select the most suitable assistance method during gameplay while balancing exploration and exploitation under uncertainty [65]. Such methods, however, require validation through controlled experiments to determine which groups of players would benefit the most from specific adaptations [66]. Finally, future work should consider expanding the player models to incorporate demographic information, anthropometric measures, or environmental context for more accurate and personalized predictions. Studies focused on deployment should also address fairness and privacy, for example by using on-device learning, while tracking concept drift across different venues or hardware configurations [67,68].
In conclusion, this study demonstrates that multiple high-performing play styles can coexist in children-focused VR games, even when control preferences differ. Our clustering analysis identified distinct behavioral profiles that varied in precision, tempo, and movement economy, while showing that both trigger and grip-based control styles can support efficient performance. By introducing a composite Development Index, we also provided an exploratory proxy for interaction maturity that aligned with these profiles. Methodologically, we contribute a transparent telemetry-based pipeline and clustering framework that can be reused in future VR learning research. From a design perspective, our findings support maintaining redundant input mappings to accommodate diverse motor abilities and suggest that adaptive support, informed by simple telemetry signals, can assist players without restricting natural play styles. These insights provide practical design recommendations for more inclusive VR games as well as a methodological foundation for further research into predictive adaptation in interactive immersive experiences.

Author Contributions

Conceptualization, M.-A.G. and S.N.; Methodology, M.-A.G.; Software, M.-A.G.; Validation, M.-A.G.; Formal analysis, M.-A.G.; Resources, S.N.; Writing—original draft, M.-A.G.; Writing—review & editing, M.-A.G. and S.N.; Visualization, M.-A.G.; Supervision, S.N. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

While players did play the game during regular play sessions, the analysis used only anonymized gameplay telemetry data automatically recorded by the game. No personally identifiable information was collected, and the data cannot be traced to individual players. This complies with GDPR provisions regarding anonymized data, and no formal ethics approval was required.

Informed Consent Statement

While players did play the game during regular play sessions, the analysis used only anonymized gameplay telemetry data automatically recorded by the game. No personally identifiable information was collected, and the data cannot be traced to individual players. Informed consent was not needed.

Data Availability Statement

The dataset supporting this study is openly available in our repository [30]. It includes (i) the full set of 151 raw session logs in JSON format, (ii) a session-level CSV derived from these logs, (iii) a processed CSV file containing the feature-engineered metrics used in the analysis, and (iv) two Jupyter notebooks with all Python code for feature extraction, clustering, validation, and figure generation. The telemetry schema is documented in the repository.

Acknowledgments

The authors would like to thank University Politehnica of Timisoara (UPT) for its support in facilitating this research. The opportunity to deploy the game during the public fair event was invaluable for gathering free-play, unguided gameplay data, and the hardware resources provided, including the four Meta Quest 2 headsets, were essential for executing this study. Additionally, the authors would like to thank the event organizers, the participating children, and their parents or legal guardians, who provided informed consent for participation.

Conflicts of Interest

The authors declare no conflicts of interest. University Politehnica of Timisoara (UPT) provided access to the public fair event for data collection and supplied hardware resources. UPT had no role in the design of the study; in the collection, analysis, or interpretation of data; in the writing of the manuscript; or in the decision to publish the results.

References

  1. Sanfilippo, F.; Tataru, M.; Hua, M.T.; Johansson, I.J.S.; Andone, D. Gamifying Cultural Immersion: Virtual Reality (VR) and Mixed Reality (MR) in City Heritage. IEEE Trans. Games 2025, 1–20. [Google Scholar] [CrossRef]
  2. Fuentes, E.M.; Varela-Aldás, J.; Palacios-Navarro, G.; García-Magariño, I. Immersive Virtual Reality App to Promote Healthy Eating in Children. In Proceedings of the HCI International 2020—Posters, Copenhagen, Denmark, 19–24 July 2020; Stephanidis, C., Antona, M., Eds.; Springer International Publishing: Cham, Switzerland, 2020; pp. 9–15. [Google Scholar]
  3. Balloufaud, M.; Boujut, A.; Marie, R.; Guinaldo, A.; Fourcade, L.; Hamonet-Torny, J.; Perrochon, A. Augmented Reality Exergames for Upcoming Cognitive-Motor Rehabilitation: User-Centered Design Approach and User Experience of Healthy Children. JMIR Rehabil. Assist. Technol. 2025, 12, e69205. [Google Scholar] [CrossRef]
  4. Erhardsson, M.; Alt Murphy, M.; Sunnerhagen, K.S. Commercial Head-Mounted Display Virtual Reality for Upper Extremity Rehabilitation in Chronic Stroke: A Single-Case Design Study. J. Neuroeng. Rehabil. 2020, 17, 154. [Google Scholar] [CrossRef]
  5. Johnson, C.I.; Fraulini, N.W.; Peterson, E.K.; Entinger, J.; Whitmer, D.E. Exploring Hand Tracking and Controller-Based Interactions in a VR Object Manipulation Task. In Proceedings of the HCI International 2023—Late Breaking Papers, Copenhagen, Denmark, 23–28 July 2023; Chen, J.Y.C., Fragomeni, G., Fang, X., Eds.; Springer Nature: Cham, Switzerland, 2023; pp. 64–81. [Google Scholar]
  6. Chen, W.-C.; Berrezueta-Guzman, S.; Wagner, S. Task-Based Role-Playing VR Game for Supporting Intellectual Disability Therapies. In Proceedings of the 2025 IEEE International Conference on Artificial Intelligence and eXtended and Virtual Reality (AIxVR), Lisbon, Portugal, 27–29 January 2025. [Google Scholar]
  7. Ijaz, K.; Tran, T.T.M.; Kocaballi, A.B.; Calvo, R.A.; Berkovsky, S.; Ahmadpour, N. Design Considerations for Immersive Virtual Reality Applications for Older Adults: A Scoping Review. Multimodal Technol. Interact. 2022, 6, 60. [Google Scholar] [CrossRef]
  8. Shen, J.; Xiang, H.; Luna, J.; Grishchenko, A.; Patterson, J.; Strouse, R.V.; Roland, M.; Lundine, J.P.; Koterba, C.H.; Lever, K.; et al. Virtual Reality-Based Executive Function Rehabilitation System for Children with Traumatic Brain Injury: Design and Usability Study. JMIR Serious Games 2020, 8, e16947. [Google Scholar] [CrossRef]
  9. Maddalon, L.; Minissi, M.E.; Cervera-Torres, S.; Hervás, A.; Gómez-García, S.; Alcañiz, M. Multimodal Interaction in ASD Children: A Usability Study of a Portable Hybrid VR System. In Proceedings of the Universal Access in Human-Computer Interaction, Copenhagen, Denmark, 23–28 July 2023; Antona, M., Stephanidis, C., Eds.; Springer Nature: Cham, Switzerland, 2023; pp. 614–624. [Google Scholar]
  10. Vlahovic, S.; Suznjevic, M.; Skorin-Kapov, L. A Framework for the Classification and Evaluation of Game Mechanics for Virtual Reality Games. Electronics 2022, 11, 2946. [Google Scholar] [CrossRef]
  11. Johnson-Glenberg, M.C. Immersive VR and Education: Embodied Design Principles That Include Gesture and Hand Controls. Front. Robot. AI 2018, 5, 375272. [Google Scholar] [CrossRef] [PubMed]
  12. Parsons, T.D. Virtual Reality for Enhanced Ecological Validity and Experimental Control in the Clinical, Affective and Social Neurosciences. Front. Hum. Neurosci. 2015, 9, 660. [Google Scholar] [CrossRef]
  13. Pellas, N.; Mystakidis, S.; Christopoulos, A. A Systematic Literature Review on the User Experience Design for Game-Based Interventions via 3D Virtual Worlds in K-12 Education. Multimodal Technol. Interact. 2021, 5, 28. [Google Scholar] [CrossRef]
  14. Alves Fernandes, L.M.; Cruz Matos, G.; Azevedo, D.; Rodrigues Nunes, R.; Paredes, H.; Morgado, L.; Barbosa, L.F.; Martins, P.; Fonseca, B.; Cristóvão, P.; et al. Exploring Educational Immersive Videogames: An Empirical Study with a 3D Multimodal Interaction Prototype. Behav. Inf. Technol. 2016, 35, 907–918. [Google Scholar] [CrossRef]
  15. Stasolla, F.; Curcio, E.; Passaro, A.; Di Gioia, M.; Zullo, A.; Martini, E. Exploring the Combination of Serious Games, Social Interactions, and Virtual Reality in Adolescents with ASD: A Scoping Review. Technologies 2025, 13, 76. [Google Scholar] [CrossRef]
  16. Fowler, L.A.; Vázquez, M.M.; DePietro, B.; Wilfley, D.E.; Fitzsimmons-Craft, E.E. Development, Usability, and Preliminary Efficacy of a Virtual Reality Experience to Promote Healthy Lifestyle Behaviors in Children: Pilot Randomized Controlled Trial. Mhealth 2024, 10, 29. [Google Scholar] [CrossRef]
  17. Dudley, J.; Yin, L.; Garaj, V.; Kristensson, P.O. Inclusive Immersion: A Review of Efforts to Improve Accessibility in Virtual Reality, Augmented Reality and the Metaverse. Virtual Real. 2023, 27, 2989–3020. [Google Scholar] [CrossRef]
  18. Kamińska, D.; Zwoliński, G.; Laska-Leśniewicz, A. Usability Testing of Virtual Reality Applications—The Pilot Study. Sensors 2022, 22, 1342. [Google Scholar] [CrossRef] [PubMed]
  19. Rainer, A.; Setiono, A.; Leonardrich, K.; Ramdhan, D. Development of “Peter’s First-Aid Adventure” Virtual Reality-Based Serious Game in First-Aid Education: Usability Analysis of Virtual Reality Interaction Tools. Procedia Comput. Sci. 2024, 245, 309–319. [Google Scholar] [CrossRef]
  20. Lee, E.-S.; Shin, B.-S. A Flexible Input Mapping System for Next-Generation Virtual Reality Controllers. Electronics 2021, 10, 2149. [Google Scholar] [CrossRef]
  21. Urech, A.; Meier, P.V.; Gut, S.; Duchene, P.; Christ, O. Mapping or No Mapping: The Influence of Controller Interaction Design in an Immersive Virtual Reality Tutorial in Two Different Age Groups. Multimodal Technol. Interact. 2024, 8, 59. [Google Scholar] [CrossRef]
  22. Wentzel, J.; Junuzovic, S.; Devine, J.; Porter, J.; Mott, M. Understanding How People with Limited Mobility Use Multi-Modal Input. In Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems, New Orleans, LA, USA, 30 April–5 May 2022; Association for Computing Machinery: New York, NY, USA, 2022; pp. 1–17. [Google Scholar]
  23. Galindo Esparza, R.P.; Dudley, J.J.; Garaj, V.; Kristensson, P.O. Exclusion Rates among Disabled and Older Users of Virtual and Augmented Reality. In Proceedings of the 2025 CHI Conference on Human Factors in Computing Systems, Yokohama, Japan, 26 April–1 May 2025; Association for Computing Machinery: New York, NY, USA, 2025; pp. 1–15. [Google Scholar]
  24. Skarbez, R.; Brooks, F.P., Jr.; Whitton, M.C. A Survey of Presence and Related Concepts. ACM Comput. Surv. 2017, 50, 96. [Google Scholar] [CrossRef]
  25. Harpstead, E.; MacLellan, C.J.; Aleven, V.; Myers, B.A. Replay Analysis in Open-Ended Educational Games. In Serious Games Analytics: Methodologies for Performance Measurement, Assessment, and Improvement; Loh, C.S., Sheng, Y., Ifenthaler, D., Eds.; Springer International Publishing: Cham, Switzerland, 2015; pp. 381–399. ISBN 978-3-319-05834-4. [Google Scholar]
  26. Drachen, A.; Sifa, R.; Bauckhage, C.; Thurau, C. Guns, Swords and Data: Clustering of Player Behavior in Computer Games in the Wild. In Proceedings of the 2012 IEEE Conference on Computational Intelligence and Games (CIG), Granada, Spain, 12–15 September 2012; pp. 163–170. [Google Scholar]
  27. Drachen, A.; Canossa, A.; Yannakakis, G.N. Player Modeling Using Self-Organization in Tomb Raider: Underworld. In Proceedings of the 2009 IEEE Symposium on Computational Intelligence and Games, Milano, Italy, 7–9 September 2009; pp. 1–8. [Google Scholar]
  28. Gagnon, D.J.; Ponto, K.; Swanson, L.; Tredinnick, R. Demonstrating Replay for Highly Scalable and Cost-Effective User Research of Virtual Reality Learning Games. In Proceedings of the 2025 11th International Conference of the Immersive Learning Research Network (iLRN)—Selected Academic Contributions, Chicago, IL, USA, 15–19 June 2025; pp. 44–50. [Google Scholar] [CrossRef]
  29. Wallner, G.; Kriglstein, S. Comparative Visualization of Player Behavior for Serious Game Analytics. In Serious Games Analytics: Methodologies for Performance Measurement, Assessment, and Improvement; Loh, C.S., Sheng, Y., Ifenthaler, D., Eds.; Springer International Publishing: Cham, Switzerland, 2015; pp. 159–179. ISBN 978-3-319-05834-4. [Google Scholar]
  30. Grosu, M.-A. VR Telemetry Analysis: Notebooks for the Data Analysis Described in this Paper, GitHub Repository. Available online: https://github.com/mihai-alexandru-grosu/vr-telemetry-analysis (accessed on 12 August 2025).
  31. Saredakis, D.; Szpak, A.; Birckhead, B.; Keage, H.A.D.; Rizzo, A.; Loetscher, T. Factors Associated with Virtual Reality Sickness in Head-Mounted Displays: A Systematic Review and Meta-Analysis. Front. Hum. Neurosci. 2020, 14, 96. [Google Scholar] [CrossRef] [PubMed]
  32. Xian, C.; Fu, M. Towards a Taxonomy of Human-Computer Interaction (HCI) Methods Based on a Survey of Recent HCI Researches. In Proceedings of the 2022 IEEE 2nd International Conference on Power, Electronics and Computer Applications (ICPECA), Shenyang, China, 21–23 January 2022; pp. 1192–1196. [Google Scholar]
  33. Katona, J. A Review of Human–Computer Interaction and Virtual Reality Research Fields in Cognitive InfoCommunications. Appl. Sci. 2021, 11, 2646. [Google Scholar] [CrossRef]
  34. Rebenitsch, L.; Owen, C. Review on Cybersickness in Applications and Visual Displays. Virtual Real. 2016, 20, 101–125. [Google Scholar] [CrossRef]
  35. LaViola, J.J. A Discussion of Cybersickness in Virtual Environments. SIGCHI Bull. 2000, 32, 47–56. [Google Scholar] [CrossRef]
  36. Argelaguet, F.; Andujar, C. A Survey of 3D Object Selection Techniques for Virtual Environments. Comput. Graph. 2013, 37, 121–136. [Google Scholar] [CrossRef]
  37. Clifton, J.; Palmisano, S. Effects of Steering Locomotion and Teleporting on Cybersickness and Presence in HMD-Based Virtual Reality. Virtual Real. 2020, 24, 453–468. [Google Scholar] [CrossRef]
  38. Carnevale, A.; Mannocchi, I.; Sassi, M.S.H.; Carli, M.; De Luca, G.; Longo, U.G.; Denaro, V.; Schena, E. Virtual Reality for Shoulder Rehabilitation: Accuracy Evaluation of Oculus Quest 2. Sensors 2022, 22, 5511. [Google Scholar] [CrossRef]
  39. Simon-Martinez, C.; Dos Santos, G.L.; Jaspers, E.; Vanderschueren, R.; Mailleux, L.; Klingels, K.; Ortibus, E.; Desloovere, K.; Feys, H. Age-Related Changes in Upper Limb Motion during Typical Development. PLoS ONE 2018, 13, e0198524. [Google Scholar] [CrossRef] [PubMed]
  40. Smits-Engelsman, B.C.M.; Sugden, D.; Duysens, J. Developmental Trends in Speed Accuracy Trade-off in 6-10-Year-Old Children Performing Rapid Reciprocal and Discrete Aiming Movements. Hum. Mov. Sci. 2006, 25, 37–49. [Google Scholar] [CrossRef]
  41. Forssberg, H.; Kinoshita, H.; Eliasson, A.C.; Johansson, R.S.; Westling, G.; Gordon, A.M. Development of Human Precision Grip. II. Anticipatory Control of Isometric Forces Targeted for Object’s Weight. Exp. Brain Res. 1992, 90, 393–398. [Google Scholar] [CrossRef] [PubMed]
  42. Ehrsson, H.H.; Fagergren, A.; Jonsson, T.; Westling, G.; Johansson, R.S.; Forssberg, H. Cortical Activity in Precision- versus Power-Grip Tasks: An fMRI Study. J. Neurophysiol. 2000, 83, 528–536. [Google Scholar] [CrossRef]
  43. Scikit-Learn. Scikit-Learn: Machine Learning in Python—Documentation. Available online: https://scikit-learn.org/stable/documentation.html (accessed on 12 August 2025).
  44. Wu, M.; Kit, C.Y.; Su, E.L.M.; Yeong, C.F.; Ahmmad, S.N.Z.; Holderbaum, W.; Yang, C. Quantitative Metrics for Evaluating Surgical Dexterity Using Virtual Reality Simulations. PLoS ONE 2025, 20, e0318660. [Google Scholar] [CrossRef] [PubMed]
  45. Sanchez, C.; Urendes, E.; Aceves, A.; Martínez-Olagüe, M.; Raya, R. Fitts’ Law-Based Identification of Motor Development Stages for the Upper Limb: Proof of Concept in Three Age Groups. PeerJ 2025, 13, e19433. [Google Scholar] [CrossRef]
  46. Wobbrock, J.O.; Kane, S.K.; Gajos, K.Z.; Harada, S.; Froehlich, J. Ability-Based Design: Concept, Principles and Examples. ACM Trans. Access. Comput. 2011, 3, 9. [Google Scholar] [CrossRef]
  47. Zohaib, M. Dynamic Difficulty Adjustment (DDA) in Computer Games: A Review. Adv. Hum.-Comput. Interact. 2018, 2018, 5681652. [Google Scholar] [CrossRef]
  48. Darzi, A.; McCrea, S.M.; Novak, D. User Experience with Dynamic Difficulty Adjustment Methods for an Affective Exergame: Comparative Laboratory-Based Study. JMIR Serious Games 2021, 9, e25771. [Google Scholar] [CrossRef] [PubMed]
  49. Alelyani, S.; Tang, J.; Liu, H. Feature Selection for Clustering: A Review. In Data Clustering; Aggarwal, C.C., Reddy, C.K., Eds.; Chapman and Hall/CRC: Boca Raton, FL, USA, 2018; pp. 29–60. ISBN 978-1-315-37351-5. [Google Scholar]
  50. Kruskal, W.H.; Wallis, W.A. Use of Ranks in One-Criterion Variance Analysis. J. Am. Stat. Assoc. 1952, 47, 583–621. [Google Scholar] [CrossRef]
  51. Dunn, O.J. Multiple Comparisons Using Rank Sums. Technometrics 1964, 6, 241–252. [Google Scholar] [CrossRef]
  52. Petersen, G.B.; Petkakis, G.; Makransky, G. A Study of How Immersion and Interactivity Drive VR Learning. Comput. Educ. 2022, 179, 104429. [Google Scholar] [CrossRef]
  53. Kalyuga, S. Expertise Reversal Effect and Its Implications for Learner-Tailored Instruction. Educ. Psychol. Rev. 2007, 19, 509–539. [Google Scholar] [CrossRef]
  54. Makransky, G.; Terkildsen, T.S.; Mayer, R.E. Adding Immersive Virtual Reality to a Science Lab Simulation Causes More Presence but Less Learning. Learn. Instr. 2019, 60, 225–236. [Google Scholar] [CrossRef]
  55. Skene, K.; O’Farrelly, C.M.; Byrne, E.M.; Kirby, N.; Stevens, E.C.; Ramchandani, P.G. Can Guidance during Play Enhance Children’s Learning and Development in Educational Contexts? A Systematic Review and Meta-Analysis. Child Dev. 2022, 93, 1162–1180. [Google Scholar] [CrossRef]
  56. Weisberg, D.S.; Hirsh-Pasek, K.; Golinkoff, R.M.; Kittredge, A.K.; Klahr, D. Guided Play: Principles and Practices. Curr. Dir. Psychol. Sci. 2016, 25, 177–182. [Google Scholar] [CrossRef]
  57. MacKenzie, I.S. Fitts’ Law as a Research and Design Tool in Human-Computer Interaction. Hum.-Comput. Interact. 1992, 7, 91–139. [Google Scholar] [CrossRef]
  58. Fisher, N.; Kulshreshth, A.K. Exploring Dynamic Difficulty Adjustment Methods for Video Games. Virtual Worlds 2024, 3, 230–255. [Google Scholar] [CrossRef]
  59. Clark, L.; Iskandarani, M.E.; Riggs, S. Reaching Interactions in Virtual Reality: The Effect of Movement Direction, Hand Dominance, and Hemispace on the Kinematic Properties of Inward and Outward Reaches. Virtual Real. 2024, 28, 43. [Google Scholar] [CrossRef]
  60. Costa Kohwalter, T.; de Azeredo Figueira, F.M.; de Lima Serdeiro, E.A.; da Silva Junior, J.R.; Gresta Paulino Murta, L.; Walter Gonzalez Clua, E. Understanding Game Sessions through Provenance. Entertain. Comput. 2018, 27, 110–127. [Google Scholar] [CrossRef]
  61. Bonometti, V.; Ringer, C.; Hall, M.; Wade, A.R.; Drachen, A. Modelling Early User-Game Interactions for Joint Estimation of Survival Time and Churn Probability. In Proceedings of the 2019 IEEE Conference on Games (CoG), London, UK, 20–23 August 2019; pp. 1–8. [Google Scholar]
  62. Zou, H.; Hastie, T. Regularization and Variable Selection Via the Elastic Net. J. R. Stat. Soc. Ser. B Stat. Methodol. 2005, 67, 301–320. [Google Scholar] [CrossRef]
  63. Niculescu-Mizil, A.; Caruana, R. Predicting Good Probabilities with Supervised Learning. In Proceedings of the 22nd International Conference on Machine Learning, Bonn, Germany, 7–11 August 2005; Association for Computing Machinery: New York, NY, USA, 2005; pp. 625–632. [Google Scholar]
  64. Bunian, S.; Canossa, A.; Colvin, R.; El-Nasr, M.S. Modeling Individual Differences in Game Behavior Using HMM. In Proceedings of the AAAI Conference on Artificial Intelligence and Interactive Digital Entertainment, Snowbird, UT, USA, 5–9 October 2017; Volume 13, pp. 158–164. [Google Scholar] [CrossRef]
  65. Slivkins, A. Introduction to Multi-Armed Bandits. arXiv 2024, arXiv:1904.07272. [Google Scholar] [CrossRef]
  66. Künzel, S.R.; Sekhon, J.S.; Bickel, P.J.; Yu, B. Metalearners for Estimating Heterogeneous Treatment Effects Using Machine Learning. Proc. Natl. Acad. Sci. USA 2019, 116, 4156–4165. [Google Scholar] [CrossRef]
  67. Gama, J.; Žliobaitė, I.; Bifet, A.; Pechenizkiy, M.; Bouchachia, A. A Survey on Concept Drift Adaptation. ACM Comput. Surv. 2014, 46, 44. [Google Scholar] [CrossRef]
  68. McMahan, H.B.; Moore, E.; Ramage, D.; Hampson, S.; Arcas, B.A.y. Communication-Efficient Learning of Deep Networks from Decentralized Data. In Proceedings of the 20th International Conference on Artificial Intelligence and Statistics, Lauderdale, FL, USA, 20–22 April 2017; PMLR: Cambridge, MA, USA, 2017. [Google Scholar]
Figure 1. High-level overview of the VRBloons system and analysis pipeline.
Figure 2. VRBloons gameplay. (a) Game environment and hot-air balloon; (b) first-person view with floating balloons during active play; (c) end-of-session scoreboard.
Figure 3. Example of JSON structure generated at the end of each session.
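As a complement to the figure, the sketch below shows what a session payload following the Table 2 schema might look like when parsed. The field names are those logged by the game; the specific values are illustrative, not real data.

```python
import json

# Hypothetical session payload using the telemetry fields from Table 2.
# Values are invented for illustration only.
raw = """{
  "playerID": "a1b2c3d4",
  "timestamp": "2025-05-01T10:15:00Z",
  "score": 2160,
  "dartsFired": 160,
  "balloonsPopped": 68,
  "accuracy": 0.425,
  "headMovement": 13.9,
  "leftControllerMove": 21.3,
  "rightControllerMove": 59.8,
  "lookAroundYawRange": 359.9,
  "lookAroundPitchRange": 108.9,
  "trigger_count": 75,
  "grip_count": 85
}"""

session = json.loads(raw)
# Sanity check: logged accuracy should equal balloonsPopped / dartsFired.
assert abs(session["accuracy"]
           - session["balloonsPopped"] / session["dartsFired"]) < 1e-3
```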
Figure 4. PCA scree plot of the eight-feature matrix.
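The data behind a scree plot like Figure 4 can be produced with scikit-learn's PCA, as in this minimal sketch; random standardized data stands in here for the real 151-session, eight-feature matrix.

```python
import numpy as np
from sklearn.decomposition import PCA

# Synthetic stand-in for the real eight-feature session matrix.
rng = np.random.default_rng(0)
X = rng.normal(size=(151, 8))              # 151 sessions x 8 features
X = (X - X.mean(axis=0)) / X.std(axis=0)   # standardize to z-scores

pca = PCA().fit(X)
explained = pca.explained_variance_ratio_  # one value per component
cumulative = explained.cumsum()            # values plotted in a scree curve
```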
Figure 5. Feature correlation matrix.
Figure 6. Accuracy vs. Grip Ratio.
Figure 7. Clusters in Two-Dimensional Space using PC1 and PC2 as the main features.
Figure 8. Clusters visualized in reduced-dimensionality space. (a) Three-dimensional PCA projection of the five standardized features; (b) t-SNE embedding of the same five-feature space, providing an alternative view of cluster separation.
Figure 9. Comparing development index across clusters.
Table 1. Available balloon types.
| Balloon Color | Score Value | Spawn Weight |
|---|---|---|
| Red | 5 | 0.60 |
| Blue | 10 | 0.15 |
| Green | 25 | 0.10 |
| Yellow | 50 | 0.06 |
| Pink | 100 | 0.04 |
| Black | 250 | 0.03 |
| White | 500 | 0.02 |
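Because the spawn weights in Table 1 sum to 1.0, they can be used directly as sampling probabilities. The sketch below illustrates one way such weighted spawning could be implemented; it is an illustrative reconstruction, not the game's actual code.

```python
import random

# Balloon types from Table 1: (color, score value, spawn weight).
BALLOONS = [
    ("Red", 5, 0.60),
    ("Blue", 10, 0.15),
    ("Green", 25, 0.10),
    ("Yellow", 50, 0.06),
    ("Pink", 100, 0.04),
    ("Black", 250, 0.03),
    ("White", 500, 0.02),
]

def spawn_balloon(rng=random):
    """Pick a (color, score) pair according to the spawn weights."""
    idx = rng.choices(range(len(BALLOONS)),
                      weights=[w for _, _, w in BALLOONS], k=1)[0]
    color, score, _ = BALLOONS[idx]
    return color, score

# Expected score per spawned balloon under these weights.
expected_value = sum(v * w for _, v, w in BALLOONS)
```

The expected value works out to 31.5 points per spawn, so rarer balloons contribute disproportionately to high scores.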
Table 2. Telemetry variables recorded during gameplay sessions.
| Metric | Description |
|---|---|
| playerID | Randomly generated unique identifier for each session |
| timestamp | UTC timestamp marking the start of the session |
| score | Final total score earned by the player from all popped balloons |
| dartsFired | Total number of darts thrown during the session |
| balloonsPopped | Total number of balloons successfully hit by the player |
| accuracy | Hit accuracy, calculated as balloonsPopped/dartsFired |
| headMovement | Total movement distance of the HMD during gameplay |
| leftControllerMove | Total movement distance of the left controller during the session |
| rightControllerMove | Total movement distance of the right controller during the session |
| lookAroundYawRange | Total yaw rotation range (left and right) in degrees, capped at 360° |
| lookAroundPitchRange | Total pitch rotation range (up and down) in degrees, capped at 180° |
| trigger_count | Total number of trigger button presses across both controllers |
| grip_count | Total number of grip button presses across both controllers |
Table 3. Summary of feature-engineered metrics.
| Metric | Formula | Units/Bounds | Description |
|---|---|---|---|
| grip_ratio | grip_count/(grip_count + trigger_count) | [0, 1] | Relative reliance on the grip button. Values near 1 indicate almost exclusive grip use; values near 0 indicate almost exclusive trigger use. |
| accuracy | balloonsPopped/dartsFired (logged at runtime) | [0, 1] | Hit probability; precision of dart throwing. |
| score_per_throw | score/dartsFired | points·throw⁻¹ | Efficiency per dart; distinguishes players who earn many points with few throws. |
| throws_per_s | dartsFired/180 (session duration is fixed at 180 s) | s⁻¹ | Mean firing tempo. Higher values reflect faster, more active play. |
| hand_movement | rightControllerMovement (renamed) | meters | Maximum cumulative distance traveled by the right- or left-hand controller, in meters. Indicates gross-motor effort. |
| head_movement | headMovement (logged at runtime) | meters | Total headset movement, in meters. Captures whole-body sway and positional movement. |
| yaw_range | lookAroundYawRange (renamed) | degrees [0, 360°) | Total horizontal rotation range, capped at 360°. |
| pitch_range | lookAroundPitchRange (renamed) | degrees [0, 180°) | Total vertical rotation range, capped at 180°. |
Note. All metrics are computed per session. Before PCA and k-means, features were winsorized at the 1st/99th percentiles and standardized to z-scores; session duration was fixed at 180 s.
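The per-session feature engineering of Table 3 can be sketched as a single function over the raw telemetry record. Field names follow Table 2; the guards against division by zero and the interpretation of hand_movement as the larger of the two controller path lengths are assumptions for illustration.

```python
SESSION_DURATION_S = 180  # fixed session length per Table 3

def engineer_features(s: dict) -> dict:
    """Map one raw telemetry record (Table 2) to the Table 3 metrics."""
    presses = s["grip_count"] + s["trigger_count"]
    return {
        "grip_ratio": s["grip_count"] / presses if presses else 0.0,
        "accuracy": (s["balloonsPopped"] / s["dartsFired"]
                     if s["dartsFired"] else 0.0),
        "score_per_throw": (s["score"] / s["dartsFired"]
                            if s["dartsFired"] else 0.0),
        "throws_per_s": s["dartsFired"] / SESSION_DURATION_S,
        # Assumed: the larger of the two controller path lengths.
        "hand_movement": max(s["leftControllerMove"],
                             s["rightControllerMove"]),
    }
```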
Table 4. Final five-feature matrix.
| Feature | Inclusion Reason |
|---|---|
| grip_ratio | Captures main input preference (trigger vs. grip) |
| accuracy | Skill and precision |
| score_per_throw | Quantifies efficiency, not just spamming darts |
| throws_per_s | Tempo and urgency of play |
| hand_movement | Total physical activity |
Table 5. Main stats summary.
| Metric | Mean | Median | SD | Min | Max | IQR |
|---|---|---|---|---|---|---|
| score | 2457.22 | 2160.00 | 1773.89 | 50.00 | 8685.00 | 962.50–3520.00 |
| accuracy | 0.43 | 0.43 | 0.16 | 0.11 | 0.86 | 0.31–0.52 |
| score_per_throw | 14.82 | 14.17 | 7.90 | 1.08 | 42.47 | 8.91–19.63 |
| throws_per_s | 0.87 | 0.94 | 0.35 | 0.06 | 1.56 | 0.59–1.15 |
| grip_ratio | 0.53 | 0.53 | 0.40 | 0.00 | 1.00 | 0.11–0.98 |
| controller_path | 64.41 | 59.84 | 40.71 | 8.88 | 426.05 | 41.81–80.74 |
| head_movement | 16.01 | 13.95 | 8.87 | 2.59 | 54.02 | 9.71–19.72 |
| yaw_range | 353.90 | 359.87 | 31.99 | 94.04 | 360.00 | 359.71–359.94 |
| pitch_range | 107.81 | 108.96 | 34.78 | 30.42 | 172.23 | 84.14–138.18 |
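A summary of this shape is straightforward to reproduce from a per-session DataFrame with pandas; the sketch below uses random values in place of the real telemetry and reports the same columns as Table 5.

```python
import numpy as np
import pandas as pd

# Synthetic stand-in for the real 151-session telemetry table.
rng = np.random.default_rng(42)
df = pd.DataFrame({"accuracy": rng.uniform(0.11, 0.86, size=151)})

# One row per metric: mean, median, SD, min, max, and IQR bounds.
summary = pd.DataFrame({
    "mean": df.mean(),
    "median": df.median(),
    "sd": df.std(),
    "min": df.min(),
    "max": df.max(),
    "q25": df.quantile(0.25),
    "q75": df.quantile(0.75),
})
```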
Table 6. Cluster profiles (mean ± SD) across k = 3, 4, and 5 solutions.
| k | Cluster | N | Grip_Ratio ± SD | Accuracy ± SD | Score/Throw ± SD | Tempo ± SD | Hand_Movement ± SD |
|---|---|---|---|---|---|---|---|
| 3 | C0 | 46 | 0.08 ± 0.09 | 0.56 ± 0.14 | 19.59 ± 7.14 | 1.00 ± 0.31 | 70.60 ± 58.71 |
| 3 | C1 | 41 | 0.94 ± 0.12 | 0.50 ± 0.10 | 19.40 ± 6.17 | 1.06 ± 0.25 | 54.43 ± 23.20 |
| 3 | C2 | 64 | 0.60 ± 0.34 | 0.29 ± 0.10 | 8.46 ± 4.09 | 0.66 ± 0.33 | 65.79 ± 32.83 |
| 4 | C0 | 46 | 0.09 ± 0.09 | 0.54 ± 0.15 | 19.37 ± 7.38 | 1.03 ± 0.27 | 71.23 ± 58.40 |
| 4 | C1 | 43 | 0.69 ± 0.32 | 0.27 ± 0.09 | 8.03 ± 3.96 | 0.79 ± 0.28 | 82.27 ± 25.51 |
| 4 | C2 | 40 | 0.94 ± 0.12 | 0.50 ± 0.10 | 19.27 ± 6.20 | 1.08 ± 0.22 | 55.45 ± 22.55 |
| 4 | C3 | 22 | 0.42 ± 0.32 | 0.37 ± 0.14 | 10.47 ± 5.40 | 0.32 ± 0.19 | 29.86 ± 14.64 |
| 5 | C0 | 25 | 0.12 ± 0.19 | 0.65 ± 0.11 | 25.46 ± 6.44 | 0.99 ± 0.32 | 63.01 ± 21.98 |
| 5 | C1 | 37 | 0.97 ± 0.09 | 0.50 ± 0.10 | 18.90 ± 4.90 | 1.09 ± 0.19 | 52.64 ± 20.40 |
| 5 | C2 | 31 | 0.15 ± 0.17 | 0.40 ± 0.08 | 12.55 ± 4.12 | 1.04 ± 0.22 | 83.37 ± 69.48 |
| 5 | C3 | 21 | 0.43 ± 0.33 | 0.37 ± 0.14 | 10.53 ± 5.53 | 0.30 ± 0.16 | 29.22 ± 14.68 |
| 5 | C4 | 37 | 0.76 ± 0.27 | 0.26 ± 0.09 | 7.89 ± 4.14 | 0.75 ± 0.27 | 80.22 ± 25.38 |
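The clustering step underlying these profiles can be sketched with scikit-learn: standardize the five engineered features, fit k-means for k = 3, 4, and 5, and average each feature within each cluster. Synthetic data stands in here for the real 151-session matrix.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in for the real 151 x 5 engineered-feature matrix.
rng = np.random.default_rng(0)
X = rng.normal(size=(151, 5))
Xz = StandardScaler().fit_transform(X)  # z-score each feature

profiles = {}
for k in (3, 4, 5):
    km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(Xz)
    # Per-cluster mean of each feature: one profile row per cluster.
    profiles[k] = [Xz[km.labels_ == c].mean(axis=0) for c in range(k)]
```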
Table 7. Kruskal–Wallis tests (n = 151 sessions).
| Metric | H | df | p |
|---|---|---|---|
| Score | 103.93 | 3 | <0.001 |
| Accuracy | 79.65 | 3 | <0.001 |
| Score per throw | 78.10 | 3 | <0.001 |
| Throws per second | 67.16 | 3 | <0.001 |
| Grip ratio | 98.90 | 3 | <0.001 |
| Hand movement | 55.36 | 3 | <0.001 |
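A test of this form compares one metric across the four k = 4 clusters (hence df = k − 1 = 3) and, when significant, is followed by Dunn's post hoc pairwise comparisons. The sketch below runs scipy's Kruskal–Wallis test on synthetic groups whose means loosely echo the cluster accuracies in Table 6; the real per-cluster data is not reproduced here.

```python
import numpy as np
from scipy.stats import kruskal

# Four synthetic groups standing in for per-cluster accuracy values;
# group means loosely follow the k = 4 cluster profiles.
rng = np.random.default_rng(1)
groups = [rng.normal(loc=mu, scale=0.1, size=40)
          for mu in (0.54, 0.27, 0.50, 0.37)]

# H statistic and p-value; df = number of groups - 1 = 3.
H, p = kruskal(*groups)
```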