Article

Real-Time Cloth Simulation in Extended Reality: Comparative Study Between Unity Cloth Model and Position-Based Dynamics Model with GPU

1 Department of Software Convergence, Soonchunhyang University, Asan 31538, Republic of Korea
2 Department of Computer Science, Soonchunhyang University, Asan 31538, Republic of Korea
* Author to whom correspondence should be addressed.
Appl. Sci. 2025, 15(12), 6611; https://doi.org/10.3390/app15126611
Submission received: 30 April 2025 / Revised: 30 May 2025 / Accepted: 2 June 2025 / Published: 12 June 2025
(This article belongs to the Special Issue New Insights into Computer Vision and Graphics)

Abstract

This study proposes a GPU-accelerated Position-Based Dynamics (PBD) system for realistic and interactive cloth simulation in Extended Reality (XR) environments, and comprehensively evaluates its performance and functional capabilities on standalone XR devices, such as the Meta Quest 3. To overcome the limitations of traditional CPU-based physics simulations, we designed and optimized highly parallelized algorithms utilizing Unity’s Compute Shader framework. The proposed system achieves real-time performance by implementing efficient collision detection and response handling with complex environmental meshes (RoomMesh) and dynamic hand meshes (HandMesh), as well as capsule colliders based on hand skeleton tracking (OVRSkeleton). Performance evaluations were conducted for both single-sided and double-sided cloth configurations across multiple resolutions. At a 32 × 32 resolution, both configurations maintained stable frame rates of approximately 72 FPS. At a 64 × 64 resolution, the single-sided cloth achieved around 65 FPS, while the double-sided configuration recorded approximately 40 FPS, demonstrating scalable quality adaptation depending on application requirements. Functionally, the GPU-PBD system significantly surpasses Unity’s built-in Cloth component by supporting double-sided cloth rendering, fine-grained constraint control, complex mesh-based collision handling, and real-time interaction with both hand meshes and capsule colliders. These capabilities enable immersive and physically plausible XR experiences, including natural cloth draping, grasping, and deformation behaviors during user interactions. The technical advantages of the proposed system suggest strong applicability in various XR fields, such as virtual clothing fitting, medical training simulations, educational content, and interactive art installations. 
Future work will focus on extending the framework to general deformable body simulation, incorporating advanced material modeling, self-collision response, and dynamic cutting simulation, thereby enhancing both realism and scalability in XR environments.

1. Introduction

In Extended Reality (XR) environments, realistic interaction with virtual objects is a key element that provides an immersive experience [1]. In particular, the simulation of flexible objects such as clothing, curtains, and flags plays an important role in building realistic virtual environments. However, implementing high-quality cloth simulation with interactive responsiveness is computationally expensive and presents significant technical challenges due to the limited computing resources of XR devices [2]. The built-in Cloth component of the widely used Unity engine provides a simple interface and stable simulation, but there are performance constraints in complex environmental interactions and high-resolution mesh processing. These constraints are particularly pronounced in standalone XR headsets such as the Meta Quest.
Recent advances in soft-body simulation research for XR environments have made significant progress in addressing these computational challenges. Fang et al. [3] conducted a comprehensive survey on improvements to Position-Based Dynamics (PBD), highlighting its growing adoption in XR applications. Their study emphasized the strengths of PBD in terms of efficiency, stability, and controllability—characteristics particularly well-suited for real-time simulation in resource-constrained XR systems. These developments collectively reinforce PBD as a promising approach for achieving high-performance, physically plausible simulations in XR environments.
This study proposes a GPU-accelerated Position-Based Dynamics (PBD) cloth simulation system and compares it with Unity’s default Cloth component. PBD is a method that utilizes position-based constraints for stable and controllable physics simulation, making it suitable for real-time applications [4]. The system developed in this research maximizes parallel processing capabilities by utilizing Unity’s Compute Shader, efficiently processing high-resolution cloth simulation and enhancing interactivity in XR environments [5].
One of the characteristics of XR environments is the integration of physical space and virtual objects. This research implements interactions between environmental meshes and cloth by leveraging this characteristic. Through the spatial awareness capabilities of the Meta Quest, meshes of real environmental objects, such as desks, chairs, and walls, are extracted and utilized in collision processing, allowing virtual cloth to naturally respond to the XR space [6]. Unlike the existing Unity Cloth component, which can only interact with simplified colliders, the proposed system enables accurate collision processing with complex meshes, allowing for realistic behaviors such as virtual cloth being placed on a desk or naturally draping along the edges of real objects.
Additionally, this research focuses on natural interaction between the user’s hands and cloth in XR environments. Using the Meta Quest’s hand tracking system, hand movements are detected in real time, and collision processing between hands and cloth is implemented [7]. Through GPU-based parallel processing algorithms, collision detection and response are efficiently calculated, enabling complex interactions, such as picking up or pressing cloth with fingers. This combination of hand interaction and environmental mesh collision processing provides an immersive experience where virtual cloth responds to users and the environment as if it were a real object.
This paper explains the design and implementation method of the proposed GPU-based simulation system and analyzes the differences in performance, visual quality, and interaction aspects through comparative experiments with the Unity Cloth component. The characteristics of both systems are comprehensively evaluated by measuring computational efficiency, memory usage, scalability according to mesh resolution, and precision of environmental mesh and hand interaction in key benchmark scenarios. The results of this research provide a technical foundation for efficient and realistic cloth simulation in XR environments, suggesting directions for implementing high-quality physics simulation on XR headsets with limited computing resources.

2. Related Work

2.1. Extended Reality (XR) and Mixed Reality (MR) Interaction Systems

Extended Reality (XR) is defined as an umbrella concept encompassing Virtual Reality (VR), Augmented Reality (AR), and Mixed Reality (MR), providing new forms of user experiences by blurring the boundaries between reality and virtuality [8,9]. Among these, MR has gained attention for its potential applications in various fields including medicine, education, manufacturing, and cultural content by supporting real-time interactions between real-world structures and virtual objects. The core of XR interaction systems lies in precisely recognizing the user’s position, gaze, and hand gestures in physical space and mapping them to virtual objects to implement physically consistent real-time interactions. To achieve this, technologies such as SLAM (Simultaneous Localization and Mapping) [10], depth sensing, spatial mapping, and hand and gaze tracking are actively utilized, with 3D engines like Unity and Unreal Engine supporting the development of these interactions [11,12]. With the emergence of high-performance XR devices such as Meta Quest 3, Apple Vision Pro, and Microsoft HoloLens 2, XR interaction systems have expanded their research domains beyond simple visual overlays to precise interactions with physical structures and real-time physics simulations [13,14]. XR-based surgical simulations, virtual fitting systems, and smart manufacturing systems are representative application cases reflecting these technological advancements. However, most existing XR systems focus on functional interactions such as object placement, UI manipulation, and gesture-based triggers, while showing relatively limited approaches to real-time simulation and collision processing of deformable soft body objects (e.g., cloth, skin, soft materials) [15]. In particular, deformable objects like cloth require high-performance parallel computing and optimization techniques due to their high computational complexity arising from non-linear deformations and collision processing [16]. 
This research proposes a system that overcomes these limitations by implementing real-time precise physical interactions between deformable objects such as cloth and scanned real-space meshes in XR environments.

2.2. Meta Quest 3 for XR Physical Simulation

Meta Quest 3 is the latest generation XR device released by Meta, offering standalone operation capabilities along with enhanced MR functionality and high-performance spatial awareness [13]. In particular, it can scan and utilize 3D mesh data of real space in real time through high-precision SLAM functionality based on RGB cameras and depth sensors [17].
Meta Quest 3 provides the Meta XR SDK for integration with the Unity engine, designed to implement various features such as Room Mesh-based spatial mapping, hand tracking, Passthrough AR, and Scene Understanding [18]. It also supports high-performance parallel processing through Compute Shaders, enabling real-time execution of advanced physics-based computations, such as large-scale particle simulations, cloth simulations, and fluid dynamics [5].
In terms of input processing, Meta Quest 3 can integrate controller-based interaction, hand tracking, and gaze tracking via Unity’s Input System, consistently supporting various forms of XR interaction [19].
Meta Quest 3 is, thus, an effective platform for research on real-time collision handling and interactions between scanned real-space meshes (Room Mesh) and deformable virtual objects such as cloth. Accordingly, this research has developed a system that realizes precise physical interactions between cloth and real-world structures by leveraging the Room Mesh functionality and Compute Shader-based simulation capabilities of Meta Quest 3.
An overview of the Meta Quest 3’s external appearance and system architecture is shown in Figure 1.
The detailed hardware specifications of the Meta Quest 3 are summarized in Table 1.

2.3. Position-Based Dynamics for Cloth Simulation

Cloth simulation is one of the core topics that has been extensively researched in the fields of computer graphics and physics-based simulation. In particular, Position-Based Dynamics (PBD) has been widely adopted as a technique providing high stability and efficiency for real-time simulations [20].
Müller et al. [4] first proposed PBD in 2007, presenting an approach that directly adjusts particle positions to satisfy physical constraints, unlike traditional force-based or velocity-based methods. This approach enables stable and fast simulation without the need for complex external force calculations.
PBD is based on the following basic mathematical principles:
  • Prediction Step:
    $p_i^{*} = p_i + \Delta t \cdot v_i + \Delta t^2 \cdot \dfrac{f_{\mathrm{ext}}(p_i)}{m_i}$
  • Constraint Solving:
    $\Delta p_i = -s \cdot w_i \cdot \nabla_{p_i} C_j(p_1, \ldots, p_n)$, with scaling factor $s = \dfrac{C_j(p_1, \ldots, p_n)}{\sum_k w_k \left\| \nabla_{p_k} C_j \right\|^2}$ and inverse mass $w_i = 1/m_i$
  • Position and Velocity Update:
    $v_i = \dfrac{(p_i^{*} + \Delta p_i) - p_i}{\Delta t} \quad \text{and} \quad p_i = p_i^{*} + \Delta p_i$
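These update rules can be sketched on the CPU for a set of distance constraints. The following Python sketch is illustrative only (the paper's implementation runs as Unity Compute Shader kernels; all names here are hypothetical): it performs the prediction step, a few Gauss–Seidel projection iterations, and the final velocity and position update.

```python
import numpy as np

def pbd_step(p, v, inv_mass, edges, rest_len, f_ext, dt, iterations=4):
    """One PBD step: predict, project distance constraints, update.

    p, v: (N,3) positions/velocities; inv_mass: (N,) w_i = 1/m_i
    (w_i = 0 pins a particle); edges: (i, j) index pairs;
    rest_len: rest lengths d_ij; f_ext: (N,3) external forces.
    """
    # Prediction: p*_i = p_i + dt*v_i + dt^2 * w_i * f_ext(p_i)
    p_pred = p + dt * v + dt * dt * inv_mass[:, None] * f_ext

    # Gauss-Seidel projection of distance constraints C = |p_i - p_j| - d
    for _ in range(iterations):
        for (i, j), d in zip(edges, rest_len):
            n = p_pred[i] - p_pred[j]
            length = np.linalg.norm(n)
            w_sum = inv_mass[i] + inv_mass[j]
            if length < 1e-9 or w_sum == 0.0:
                continue
            s = (length - d) / w_sum        # scaling factor for this constraint
            grad = n / length               # gradient of C w.r.t. p_i
            p_pred[i] -= inv_mass[i] * s * grad
            p_pred[j] += inv_mass[j] * s * grad

    # Update: v_i = (p*_i - p_i)/dt, then commit the corrected positions
    v_new = (p_pred - p) / dt
    return p_pred, v_new
```

A pinned particle (inverse mass zero) stays fixed while its neighbor is pulled back to the rest length, which is the behavior the constraint-solving equation above describes.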

2.4. GPU-Accelerated Physics Simulation

In XR environments, achieving physically plausible real-time interactions with the physical world requires high computational performance [21]. Particularly, in complex nonlinear systems such as cloth, soft bodies, and fluids, it is necessary to compute the physical states of thousands of particles or vertices within a single frame, making it difficult to achieve real-time performance with CPU-based approaches alone [22,23].
As a result, research on accelerating physics simulations using the massive parallel processing capabilities of GPUs has been actively conducted [24]. GPUs are capable of simultaneously executing thousands of threads, effectively handling the high parallelism and computational load inherent in particle-based simulations. Technologies such as Compute Shader, CUDA, and OpenCL have been utilized to support these high-performance computations.
In the context of cloth simulation, GPU acceleration is typically applied in the following ways:
  • Parallel computation of particle data (position, velocity, external forces);
  • Parallel enforcement of constraints, including distance, bending, and fixed-point constraints;
  • Parallelized collision detection with meshes or colliders;
  • Dynamic optimization of simulation iterations to balance quality and real-time performance.
Previous studies have explored various GPU-based cloth simulation methods. Müller et al. [4] accelerated mass-spring-based cloth simulation on the GPU for character clothing with interactive performance, while Deul et al. implemented a cloth simulation capable of handling thousands of particles using a PBD-based approach with minimal computational delay.
However, most existing works have focused on simulations within closed virtual environments, with relatively few addressing direct physical interactions between scanned real-world meshes and deformable cloth objects in dynamic settings.
To overcome these limitations, this study proposes a GPU-based PBD cloth simulation system built on the Meta Quest 3 platform and Unity’s Compute Shader framework. The proposed system enables real-time physical interactions between scanned real-world spatial meshes and virtual cloth objects, thereby introducing a novel approach for physics-driven XR interaction.

3. Methodology

3.1. System Overview

The overall structure of the proposed GPU-based PBD (Position-Based Dynamics) cloth simulation system for Mixed Reality environments is illustrated in Figure 2. The system is designed considering the characteristics of the Meta Quest platform within the Unity engine environment [25] and consists of four main modules: (1) Data Management Module, (2) GPU-Based Simulation Module, (3) Collision Handling Module, and (4) Rendering and Integration Module.
The Data Management Module manages data such as vertex and topology information of the cloth mesh, physical properties, and constraint settings. It is responsible for efficiently transferring this data between CPU and GPU memory. This module initializes the cloth geometry and constructs the necessary data structures for simulation.
The GPU-Based Simulation Module implements the Position-Based Dynamics algorithm using Compute Shaders, executing the cloth simulation pipeline in parallel. It includes stages such as external force application, velocity update, position prediction, constraint satisfaction, and collision handling. Each stage is implemented as an independent compute kernel to maximize parallel processing efficiency.
The Collision Handling Module manages interactions between the cloth and its environment, handling three types of collisions: (1) collisions with primitive shapes such as spheres and boxes, (2) collisions with complex meshes acquired from real-world scans using spatial mapping, and (3) collisions with user hands through hand tracking [7]. In particular, real-world mesh collision leverages spatial mapping data obtained from Meta Quest’s environmental awareness features, enabling realistic interactions between the virtual cloth and real-world structures like desks and chairs.
The Rendering and Integration Module visualizes simulation results and integrates them into Unity’s rendering system. It handles double-sided rendering, shading, texture mapping, and transforms simulation data to match Unity’s coordinate and transform systems.
The overall workflow of the system is as follows: during initialization, the vertex, triangle, and constraint data structures are generated on the CPU and transferred to GPU memory. In each frame, parallel simulation is performed on the GPU based on the current simulation state and environmental data (e.g., spatial meshes and hand tracking data). Upon completion, simulation results are selectively transferred back to the CPU and integrated into the Unity rendering pipeline.
The main features of the proposed system are summarized as follows. First, the entire simulation process is performed on the GPU, enabling real-time performance even with high-resolution cloth meshes. Second, the system supports precise physical interactions with objects in the real world using Meta Quest’s hand tracking and spatial mapping capabilities. Third, various cloth materials and behaviors can be expressed through user-defined constraints and physical parameters. Fourth, the system is seamlessly integrated with Unity, allowing easy adoption in existing XR applications.
This system design aims to achieve high-quality cloth simulation even on standalone XR headsets with limited computational resources. The next sections provide a detailed description of each module and the core algorithms used.

3.2. GPU-Based PBD Cloth Framework (Reality Collision Version)

The GPU-based PBD (Position-Based Dynamics) cloth simulation framework proposed in this study was implemented using the Compute Shader functionality of the Unity engine. It was designed to enable real-time cloth simulation even on XR devices with limited computational resources, such as the Meta Quest series.
This framework handles external forces, constraints, and collision detection in a parallelized manner. In particular, it supports collision handling with real-world spatial meshes (RoomMesh), thereby enhancing user immersion in XR environments. To balance simulation accuracy and real-time performance, each stage of the simulation—external force application, constraint projection, collision detection and resolution, and final position update—is separated into distinct compute kernels that are executed in parallel on the GPU.

3.2.1. Algorithm Overview

The overall algorithm flow of the developed GPU-based PBD cloth simulation system is described as pseudocode in Algorithm 1. The algorithm is divided into two main parts: the initialization phase and the per-frame simulation loop. Each phase is based on parallel processing through GPU compute kernels.
In particular, the simulation flow includes external force application, position prediction, constraint satisfaction, collision detection with various objects (RoomMesh, Hand Mesh), and final position and velocity updates.
Detailed implementations of the Compute Shader kernel design (Section 3.2.3), RoomMesh-based collision handling (Section 3.2.4), and Hand Collision handling algorithm (Section 3.2.5) are described in the subsequent sections.
Algorithm 1 GPU-Based PBD cloth simulation framework
 1: Initialization Phase:
 2: Create cloth mesh vertices, edges, and constraint data.
 3: Upload vertex data to GPU memory.

 4: Per-Frame Simulation Loop:
 5: Apply external forces to vertices.
 6: Predict vertex positions.
 7: Solve distance and bending constraints.
 8: Detect collisions with RoomMesh and Hand Mesh.
 9: Resolve collisions and update projected positions.
10: Update vertex positions and velocities.
11: Render the cloth mesh.

3.2.2. System Flow Using Flowchart

The overall flow of the GPU-based PBD cloth simulation system can be illustrated as shown in Figure 3. The system is divided into an initialization phase and a simulation phase that repeats every frame.

3.2.3. Compute Shader Kernel Design

To achieve real-time cloth simulation in XR environments with limited computational resources, the proposed system implements the core Position-Based Dynamics (PBD) algorithm using Unity’s Compute Shader functionality. Compute Shaders enable highly parallel processing by utilizing the GPU, allowing thousands of particles to be simultaneously simulated at interactive framerates.
The simulation workflow is decomposed into multiple independent compute kernels, each responsible for a specific stage of the PBD pipeline. This modular design maximizes parallelism, optimizes memory access patterns, and reduces synchronization overhead.
The main Compute Shader kernels designed for the system are summarized below:
  • ApplyExternalForces: Applies external forces such as gravity to the velocities of particles.
  • DampVelocities: Applies damping to particle velocities to simulate energy dissipation.
  • ApplyExplicitEuler: Predicts new particle positions using explicit Euler integration.
  • ProjectConstraintDeltas: Projects particles to satisfy distance and bending constraints, including atomic operations for parallel accumulation.
  • AverageConstraintDeltas: Averages accumulated constraint deltas and updates particle positions accordingly.
  • SatisfySphereCollisions and SatisfyCubeCollisions: Handles collision responses against primitive colliders such as spheres and cubes.
  • SatisfyMeshCollisions: Detects and resolves collisions between cloth particles and RoomMesh triangles.
  • SatisfyHandMeshCollisions: Handles dynamic collision interactions with the user’s hand mesh obtained via Meta Quest OVR Mesh data.
  • UpdatePositions: Updates final particle velocities and positions after constraint satisfaction and collision response.
Each kernel is carefully optimized by selecting appropriate thread group sizes and minimizing conditional branching inside the shader code. Through this highly parallelized architecture, the system achieves stable real-time performance even for moderately high-resolution cloth meshes on standalone XR devices.
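As a rough CPU analogue of the ProjectConstraintDeltas/AverageConstraintDeltas kernel pair, the Python sketch below (illustrative names; the actual kernels are HLSL Compute Shaders) accumulates per-constraint corrections into a delta buffer, which is the role played by atomic adds on the GPU, and then averages them per particle in a second pass.

```python
import numpy as np

def project_and_average(p_pred, inv_mass, edges, rest_len):
    """Jacobi-style constraint projection with deferred averaging.

    Each distance constraint writes its correction into an accumulation
    buffer (the GPU version uses atomic adds for this); corrections are
    then averaged per particle and applied in a single pass.
    """
    deltas = np.zeros_like(p_pred)              # accumulated corrections
    counts = np.zeros(len(p_pred), dtype=int)   # constraints touching each particle

    for (i, j), d in zip(edges, rest_len):
        n = p_pred[i] - p_pred[j]
        length = np.linalg.norm(n)
        w_sum = inv_mass[i] + inv_mass[j]
        if length < 1e-9 or w_sum == 0.0:
            continue
        corr = (length - d) / w_sum * (n / length)
        deltas[i] -= inv_mass[i] * corr         # atomic add on the GPU
        deltas[j] += inv_mass[j] * corr
        counts[i] += 1
        counts[j] += 1

    touched = counts > 0
    p_pred[touched] += deltas[touched] / counts[touched][:, None]
    return p_pred
```

Splitting projection and averaging lets every constraint be processed by an independent thread without write conflicts, at the cost of one extra buffer and a second dispatch.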

3.2.4. RoomMesh-Based Collision Handling Algorithm

To enable realistic physical interactions with real-world structures, the proposed system utilizes RoomMesh data captured by the Meta Quest platform. RoomMesh represents surrounding environments, such as desks, walls, and chairs, using a detailed triangulated mesh.
However, directly performing collision detection against the entire RoomMesh, which typically contains thousands of vertices and triangles, is computationally expensive and impractical for real-time simulation. To address this issue, the system dynamically extracts a SubMesh based on user-defined regions of interest (ROIs).
The user interaction flow is as follows:
  • Press the A button to visualize the complete RoomMesh.
  • Use the right-hand trigger button to select six points, defining the region of interest.
  • Construct an Axis-Aligned Bounding Box (AABB) from the selected points.
  • Extract only the vertices and triangles of the RoomMesh that intersect the AABB to form the SubMesh.
By restricting collision calculations to the SubMesh:
  • Memory usage is significantly reduced.
  • Computational load for collision detection is minimized.
  • Real-time responsiveness of cloth–environment interaction is enhanced.
The process of SubMesh extraction and filtering is summarized in Algorithm 2.
Algorithm 2 SubMesh extraction based on user-defined selection box
Require: RoomMesh vertices V, triangles T, selected points {P1, P2, …, P6}
Ensure: SubMesh vertices V′, triangles T′
 1: Compute min_point and max_point from the selected points
 2: Construct an Axis-Aligned Bounding Box (AABB) from min_point and max_point
 3: Initialize empty lists new_vertices and new_triangles
 4: Initialize empty mapping vertex_mapping
 5: for each triangle (i0, i1, i2) ∈ T do
 6:     Transform the vertices at i0, i1, i2 to world coordinates wv0, wv1, wv2
 7:     if any vertex wv0, wv1, or wv2 is inside the AABB then
 8:         for each idx ∈ {i0, i1, i2} do
 9:             if idx not in vertex_mapping then
10:                 Add wv_idx to new_vertices
11:                 Map idx to its new index
12:             end if
13:         end for
14:         Add a new triangle using the mapped indices
15:     end if
16: end for
17: Create the SubMesh from new_vertices and new_triangles
18: Instantiate the SubMesh as a GameObject with the necessary components
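The filtering loop of Algorithm 2 can be sketched in plain Python as follows. This is an illustrative CPU sketch, not the system's Unity code; it assumes vertices are already in world coordinates, whereas the actual system transforms them per triangle.

```python
def extract_submesh(vertices, triangles, selected_points):
    """Keep only the triangles that touch the AABB built from the
    user-selected points, remapping vertex indices to a compact range."""
    # AABB from the selected points
    min_pt = [min(p[k] for p in selected_points) for k in range(3)]
    max_pt = [max(p[k] for p in selected_points) for k in range(3)]

    def inside(v):
        return all(min_pt[k] <= v[k] <= max_pt[k] for k in range(3))

    new_vertices, new_triangles, vertex_mapping = [], [], {}
    for (i0, i1, i2) in triangles:
        tri = [vertices[i0], vertices[i1], vertices[i2]]
        if any(inside(v) for v in tri):        # triangle touches the AABB
            mapped = []
            for idx in (i0, i1, i2):
                if idx not in vertex_mapping:  # remap to a compact index
                    vertex_mapping[idx] = len(new_vertices)
                    new_vertices.append(vertices[idx])
                mapped.append(vertex_mapping[idx])
            new_triangles.append(tuple(mapped))
    return new_vertices, new_triangles
```

Because only triangles intersecting the selection box survive, the collision kernels afterwards iterate over a mesh that is typically a small fraction of the full RoomMesh.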

3.2.5. Hand Collision Handling Algorithm

Realistic collision between the virtual cloth and the user’s hand is essential for enhancing immersion in XR environments. The proposed system utilizes both the OVR Skeleton and OVR Mesh functionalities provided by Unity’s Meta Quest SDK to acquire real-time hand joint and mesh data, enabling accurate collision detection and response based on the user’s hand movement.
Two hand collision handling methods are considered in this work: a capsule collider-based method, shown in Algorithm 3, and a triangle mesh-based method using OVR Mesh, shown in Algorithm 4.
Algorithm 3 Hand collision handling for cloth particles (using capsule collider)
 1: for each cloth particle p_i do
 2:     for each capsule collider (pointA, pointB, r) do
 3:         Compute vector ab ← pointB − pointA
 4:         Compute projection factor t ← clamp(((p_i − pointA) · ab) / (ab · ab), 0.0, 1.0)
 5:         Compute closest point q ← pointA + t · ab
 6:         Compute distance d ← ‖p_i − q‖
 7:         if d < r then
 8:             Compute normal vector n ← normalize(p_i − q)
 9:             Move particle: p_i ← q + n · (r + collisionMargin)
10:             Apply friction to the velocity of p_i
11:         end if
12:     end for
13: end for
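The per-capsule test in Algorithm 3 can be sketched as a small Python function (illustrative only; the real version is a Compute Shader kernel, and friction handling is omitted here):

```python
import math

def resolve_capsule_collision(p, point_a, point_b, r, margin=0.0):
    """Project a cloth particle out of a capsule defined by the segment
    (point_a, point_b) and radius r. Returns the corrected position;
    the input position is returned unchanged when there is no contact."""
    ab = [b - a for a, b in zip(point_a, point_b)]
    ap = [pi - a for a, pi in zip(point_a, p)]
    denom = sum(c * c for c in ab)
    # t = clamp(((p - A) . AB) / (AB . AB), 0, 1)
    t = 0.0 if denom == 0.0 else max(
        0.0, min(1.0, sum(x * y for x, y in zip(ap, ab)) / denom))
    q = [a + t * c for a, c in zip(point_a, ab)]    # closest point on segment
    d_vec = [pi - qi for pi, qi in zip(p, q)]
    d = math.sqrt(sum(c * c for c in d_vec))
    if d >= r or d == 0.0:
        return list(p)                              # no penetration
    n = [c / d for c in d_vec]                      # outward normal
    return [qi + ni * (r + margin) for qi, ni in zip(q, n)]
```

The clamp on t handles the capsule's hemispherical caps: particles beyond either endpoint are pushed away from the nearest endpoint rather than the segment interior.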
Algorithm 4 Hand collision handling for cloth particles (using OVR mesh)
 1: for each cloth particle p_i do
 2:     Initialize minDistance ← large value
 3:     for each triangle (v1, v2, v3) of the OVR hand mesh do
 4:         Compute closestPoint ← ClosestPointOnTriangle(p_i, v1, v2, v3)
 5:         Compute distance ← Euclidean distance between p_i and closestPoint
 6:         if distance < minDistance then
 7:             minDistance ← distance
 8:             nearestPoint ← closestPoint
 9:             collisionNormal ← normalized cross product of the triangle edges
10:         end if
11:     end for
12:     if minDistance < collisionMargin then
13:         Move p_i to nearestPoint offset by collisionNormal
14:         Apply the friction coefficient to p_i’s velocity
15:     end if
16: end for
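Algorithm 4 relies on a ClosestPointOnTriangle helper. A standard Voronoi-region construction for this query is sketched below in Python (illustrative; the paper's version lives in shader code, and the exact implementation is not given in the text):

```python
import numpy as np

def closest_point_on_triangle(p, a, b, c):
    """Closest point on triangle (a, b, c) to point p, via the standard
    Voronoi-region case analysis: test vertex regions, then edge
    regions, and fall through to the face interior."""
    p, a, b, c = map(np.asarray, (p, a, b, c))
    ab, ac, ap = b - a, c - a, p - a
    d1, d2 = ab @ ap, ac @ ap
    if d1 <= 0 and d2 <= 0:
        return a                                   # vertex A region
    bp = p - b
    d3, d4 = ab @ bp, ac @ bp
    if d3 >= 0 and d4 <= d3:
        return b                                   # vertex B region
    vc = d1 * d4 - d3 * d2
    if vc <= 0 and d1 >= 0 and d3 <= 0:
        return a + (d1 / (d1 - d3)) * ab           # edge AB region
    cp = p - c
    d5, d6 = ab @ cp, ac @ cp
    if d6 >= 0 and d5 <= d6:
        return c                                   # vertex C region
    vb = d5 * d2 - d1 * d6
    if vb <= 0 and d2 >= 0 and d6 <= 0:
        return a + (d2 / (d2 - d6)) * ac           # edge AC region
    va = d3 * d6 - d5 * d4
    if va <= 0 and (d4 - d3) >= 0 and (d5 - d6) >= 0:
        return b + ((d4 - d3) / ((d4 - d3) + (d5 - d6))) * (c - b)  # edge BC
    denom = va + vb + vc
    return a + (vb / denom) * ab + (vc / denom) * ac  # face interior
```

Branchy code like this is costly on a GPU; the shader version would typically be rewritten to minimize divergence, e.g., by computing all candidate points and selecting with arithmetic predicates.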

3.2.6. Summary

This section described the overall architecture and core implementation details of the GPU-based Position-Based Dynamics (PBD) cloth simulation system. The system leverages Compute Shaders to parallelize the stages of external force application, constraint projection, collision detection and resolution, and final position updates, achieving real-time cloth simulation even in resource-constrained XR environments.
For realistic physical interactions with the real world, RoomMesh data captured by the Meta Quest device was utilized to enable accurate collision handling with scanned structures. To optimize computational efficiency, a SubMesh extraction method based on user-defined selection was introduced, reducing the collision computation range.
Furthermore, dynamic collision detection with the user’s hand mesh, obtained via the OVR Mesh functionality, was implemented to enhance interaction realism.
Through these system designs and optimizations, the proposed framework enables physically plausible and immersive real-time cloth interactions within XR environments.

4. Results

Before presenting detailed experimental results, Table 2 provides a summary of all major experiments conducted in this study. Each experiment is categorized by its objective, implementation setup, and the systems compared. This overview helps contextualize the performance and functional evaluations discussed in subsequent subsections.

4.1. Performance Analysis: Resolution vs. Frame Rate

4.1.1. Resolution-Based Performance Test

To evaluate the real-time performance of the proposed GPU-PBD-based cloth simulation, a comprehensive performance test was conducted by comparing it with Unity’s built-in Cloth component across various cloth resolutions. All experiments were performed in a standalone XR environment using the Meta Quest 3 device.
Performance was measured based on frames per second (FPS) across five different cloth resolutions: 8 × 8, 16 × 16, 32 × 32, 64 × 64, and 128 × 128. Each test involved creating a static cloth mesh at the specified resolution and applying only gravity forces, without any desk collision experiments. FPS was measured using the Meta Quest Developer Hub Performance Analyzer. The numerical results are summarized in Table 3, and a graphical comparison is illustrated in Figure 4.
The results demonstrate that Unity’s Cloth component consistently maintains 72 FPS across all tested resolutions, likely benefiting from advanced internal optimizations. The proposed GPU-PBD system achieves comparable performance up to a 32 × 32 resolution for both single-sided and double-sided configurations. At higher resolutions, the single-sided GPU-PBD maintains relatively high performance (65 FPS at 64 × 64, 42 FPS at 128 × 128), whereas the double-sided GPU-PBD experiences greater performance degradation (40 FPS at 64 × 64, 16 FPS at 128 × 128). Nonetheless, even the double-sided implementation maintains sufficient real-time performance for XR applications at resolutions up to 64 × 64.

4.1.2. GPU Memory Access Analysis

To complement the FPS-based performance analysis, GPU memory access behavior was analyzed across five cloth resolutions (8 × 8 to 128 × 128). We measured the peak GPU memory bandwidth for both read and write operations under the same static cloth simulation conditions used in the FPS test. Measurements were taken using the Meta Quest Developer Hub over a 5 s sampling window, and the maximum observed bandwidth values were recorded.
Figure 5 presents the comparison of memory read and write bandwidth across three systems: Unity Cloth, GPU-PBD (Single-Sided), and GPU-PBD (Double-Sided). As expected, the GPU-PBD systems, especially the double-sided configuration, exhibited significantly higher read bandwidth usage due to increased vertex access and constraint computations. Write bandwidth was less consistent across resolutions, reflecting the asynchronous and conditional nature of constraint projection updates.
Despite higher memory access, the GPU-PBD system remains efficient for real-time XR deployment up to 64 × 64 resolution. These findings reinforce the trade-off between physical realism and GPU resource consumption, offering additional insight into performance scalability.

4.1.3. Functional Capability Comparison

In addition to frame rate analysis, it is important to evaluate the functional capabilities of each system, especially in the context of XR interaction. Table 4 summarizes the feature comparison between Unity Cloth and the GPU-PBD implementations.

4.1.4. Summary and Discussion

The performance and functional evaluation revealed several key findings that provide important insights into the comparative strengths of both cloth simulation systems.
The Unity Cloth system demonstrated consistent performance across all resolution ranges, maintaining a stable 72 FPS throughout all tested configurations. This remarkable performance consistency can be attributed to Unity’s highly optimized internal algorithms and extensive performance tuning developed over many years. The ability to maintain performance without degradation even at high resolutions (128 × 128) represents Unity Cloth’s most significant advantage. Additionally, the system offers excellent development convenience through Unity Editor’s intuitive GUI, enabling developers to implement cloth simulation without complex programming requirements. The relatively low GPU memory usage also contributes to efficient resource management, making it particularly suitable for resource-constrained environments.
In contrast, the proposed GPU-PBD system showed equivalent performance to Unity Cloth up to 32 × 32 resolution, but experienced performance degradation at higher resolutions. However, the true value of the GPU-PBD system lies in its functional capabilities. Advanced features specifically designed for XR environments—including double-sided cloth simulation, real-time hand mesh collision handling, and interaction with real-world environments through RoomMesh—represent unique characteristics not supported by Unity Cloth. Particularly noteworthy is the system’s ability to generate more realistic and rich cloth folding and deformation at higher resolutions, demonstrating superior visual quality.
From a practical perspective in Meta Quest 3 environments, the 32 × 32 resolution provides an optimal balance between visual realism and real-time performance for both systems. The experimental resolution was capped at 128 × 128 because this represents a practical upper bound for real-time cloth simulation on standalone XR devices such as the Meta Quest 3. At this resolution, the vertex count reaches 16,384, and the computational load, including constraint projection and collision handling, increases significantly. Beyond this threshold, frame rates fall below a commonly accepted floor for real-time interaction (roughly 15 FPS), compromising the XR experience. Therefore, 128 × 128 was selected as the maximum resolution to balance visual realism, computational stability, and real-time responsiveness.
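The scaling behind this cap can be made concrete: for an n × n particle grid, vertices grow as n², and the number of distance constraints grows at the same rate, so each doubling of resolution roughly quadruples the per-frame work. The sketch below assumes one common structural/shear/bending constraint layout; the paper's exact constraint set may differ.

```python
def cloth_counts(n):
    """Vertex and distance-constraint counts for an n x n particle grid.

    Assumed layout (one common PBD convention, not necessarily the paper's):
    structural = horizontal + vertical neighbors, shear = both diagonals
    per quad, bending = skip-one neighbors along each row and column.
    """
    vertices = n * n
    structural = 2 * n * (n - 1)
    shear = 2 * (n - 1) * (n - 1)
    bending = 2 * n * (n - 2)
    return vertices, structural + shear + bending

for n in (8, 16, 32, 64, 128):
    v, c = cloth_counts(n)
    print(f"{n:>3} x {n:<3} -> {v:6d} vertices, {c:6d} distance constraints")
```

At 128 × 128 this yields the 16,384 vertices cited above, with nearly 100,000 constraints to project every solver iteration, which is consistent with the sharp frame-rate drop observed at that resolution.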
For projects requiring basic cloth simulation where high performance and development efficiency are prioritized, Unity Cloth represents the more suitable choice. Conversely, when natural interaction between users’ hands and real environments in Mixed Reality settings is crucial, and when visual quality and functional extensibility are prioritized, the GPU-PBD system demonstrates clear superiority. The Compute Shader-based architecture of GPU-PBD particularly facilitates future additions of custom physics laws or constraint conditions, making it more suitable for research and advanced development purposes. Meanwhile, the GPU-PBD single-sided mode was introduced specifically for fair performance comparison with Unity Cloth, but for actual XR applications, the double-sided mode is recommended as the standard configuration due to its enhanced functionality.

4.2. Collision Interaction with RoomMesh

4.2.1. SubMesh Extraction and Collision Interaction Process

In this study, to enable precise physical interaction between virtual cloth and real-world surfaces, we utilize the RoomMesh scanning capabilities of Meta Quest 3. The user selects four points using the controller, based on which an Axis-Aligned Bounding Box (AABB) is generated. Only the RoomMesh triangles contained within the AABB are filtered to create a SubMesh, which is used as a collision surface for cloth simulation experiments.
This SubMesh extraction method reduces computational overhead by focusing collision processing only on necessary areas, thus ensuring real-time performance. The overall process of SubMesh extraction and collision interaction is illustrated in Figure 6.
The overall SubMesh extraction and collision process consists of the following four steps:
  • Step 1: RoomMesh Acquisition
    The environment is scanned using Meta Quest 3 to acquire a full RoomMesh.
  • Step 2: Region Selection (White Pointer)
    The user defines the target region by selecting six points with the controller, where a white pointer visually indicates the selected points.
  • Step 3: SubMesh Generation and Collision Position Display
    Based on the selected points, an AABB is generated, and RoomMesh triangles within the AABB are extracted to form a SubMesh. After dropping the virtual cloth onto the SubMesh, the final collision position is displayed and held on screen for approximately 1 s.
  • Step 4: Cloth Collision Experiment
    The SubMesh is instantiated as the collision surface, and the cloth simulation experiment is performed.
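The filtering in Steps 2 and 3 amounts to a simple AABB membership test: build min/max corners from the selected points, then keep each RoomMesh triangle whose vertices all fall inside the box (a conservative inclusion rule; the authors' Unity implementation may use a different rule). The function and variable names below are illustrative, not the paper's actual code.

```python
import numpy as np

def submesh_from_aabb(vertices, triangles, picked_points, margin=0.02):
    """Filter a scanned RoomMesh down to the triangles inside an AABB.

    vertices: (V, 3) float array of RoomMesh vertex positions.
    triangles: (T, 3) int array of vertex indices per triangle.
    picked_points: (K, 3) controller-selected points spanning the region.
    margin: padding (meters) so boundary triangles survive (assumed value).
    """
    pts = np.asarray(picked_points, dtype=float)
    lo = pts.min(axis=0) - margin          # AABB min corner
    hi = pts.max(axis=0) + margin          # AABB max corner
    tri_verts = vertices[triangles]        # (T, 3, 3) per-triangle vertices
    inside = ((tri_verts >= lo) & (tri_verts <= hi)).all(axis=(1, 2))
    return triangles[inside]               # triangle list of the SubMesh

# toy example: a two-triangle "floor"; the selection covers only the first
verts = np.array([[0, 0, 0], [1, 0, 0], [0, 0, 1], [5, 0, 5]], dtype=float)
tris = np.array([[0, 1, 2], [1, 3, 2]])
sub = submesh_from_aabb(verts, tris, [[-0.1, -0.1, -0.1], [1.1, 0.1, 1.1]])
print(sub)   # only the first triangle survives the filter
```

Restricting collision queries to this SubMesh is what keeps the per-frame cost bounded regardless of how large the full RoomMesh scan is.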

4.2.2. Experimental Setup

Table 5 summarizes the experimental settings for evaluating the collision interactions between cloth and SubMesh.

4.2.3. Collision Processing Results: Unity Cloth vs. GPU-PBD

To evaluate the collision handling performance, we compared Unity’s built-in Cloth component and the proposed GPU-PBD Cloth system under identical experimental conditions. The results are shown in Figure 7, providing a direct 1:1 visual comparison.
  • Unity Cloth: The cloth fails to properly detect the desk surface, resulting in the cloth passing through it.
  • GPU-PBD Cloth (Double-Sided): The cloth accurately detects and interacts with the SubMesh surface, resting naturally on the desk.
These results demonstrate that the GPU-PBD system enables more accurate and physically plausible interaction between virtual cloth and real-world surfaces in XR environments, overcoming the limitations of Unity’s default cloth simulation.

4.3. Collision Setup Using OVRSkeleton Capsule Colliders

In this study, realistic interactions between the virtual cloth and the user’s hand were implemented by utilizing the hand tracking system provided by Meta Quest 3, specifically the OVRSkeleton component. OVRSkeleton provides real-time tracking of 19 key joints in the hand, and each joint is associated with a Capsule Collider that can be used for physical collision handling in Unity.
In the experimental setup, 19 Capsule Colliders were attached to the respective joint positions, and these were used as the collision targets for both Unity’s built-in Cloth system and the proposed GPU-based PBD Cloth system. The user’s hand moved from the right side of the virtual cloth to the left, causing continuous sweeping collisions across the cloth surface.
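A per-particle capsule test of the kind described here projects the particle onto the capsule's core segment and, if it penetrates, pushes it out along the radial direction. The sketch below is an illustrative single-particle CPU version (the paper's system runs the equivalent test in a Compute Shader over all cloth particles, against all 19 joint capsules, in parallel).

```python
import numpy as np

def resolve_capsule_collision(p, a, b, radius):
    """Push particle position p out of a capsule with core segment a-b.

    Returns the corrected position; unchanged if there is no penetration.
    """
    ab = b - a
    # closest point on the core segment (clamped projection onto a-b)
    t = np.clip(np.dot(p - a, ab) / np.dot(ab, ab), 0.0, 1.0)
    closest = a + t * ab
    offset = p - closest
    dist = np.linalg.norm(offset)
    if dist >= radius or dist == 0.0:
        return p                              # outside (or degenerate)
    return closest + offset * (radius / dist) # project onto capsule surface

# e.g., one finger-joint capsule of radius 10 cm along the y-axis
a = np.array([0.0, 0.0, 0.0])
b = np.array([0.0, 1.0, 0.0])
p = np.array([0.05, 0.5, 0.0])                # particle 5 cm from the axis
q = resolve_capsule_collision(p, a, b, radius=0.1)
print(q)   # pushed out to the surface: [0.1, 0.5, 0.0]
```

In a PBD pipeline this correction is applied to the predicted positions before (or interleaved with) constraint projection, which is why the deformation persists instead of springing back.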

4.3.1. Experimental Setup

To evaluate the effectiveness of capsule-based hand–cloth collision, we conducted controlled simulation experiments using Unity capsule colliders and OVR Skeleton-based hand tracking. The detailed experimental configuration is summarized in Table 6.

4.3.2. Collision Response Comparison

Figure 8 and Figure 9 show the collision behavior of the Unity Cloth and GPU-PBD Cloth systems, respectively, under identical hand interaction conditions.
(a) Unity Cloth: While Unity’s built-in Cloth component supports collision with Capsule Colliders, it showed instability under continuous interaction. During the experiment, the hand’s sweeping motion caused previously deformed cloth particles to snap back to their original positions once the collider passed, resulting in unnatural popping or sudden restoration effects. This disrupts the perception of physical continuity and realism, as shown in Figure 8.
(b) GPU-PBD Cloth: In contrast, the proposed GPU-PBD system handled the same Capsule Colliders using a compute-shader-based parallel collision kernel. The cloth particles responded stably throughout the hand movement, and their positions were preserved post-collision due to constraint projection and friction handling. As shown in Figure 9, this resulted in visually smooth, physically plausible deformations without noticeable popping artifacts.
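The constraint projection referred to above is the standard PBD step from Müller et al. (2007): after collision displacement, each distance constraint pulls its two particles back toward their rest length, weighted by inverse mass. A minimal single-constraint sketch (illustrative; the actual kernel applies this to every constraint in parallel on the GPU, with the authors' stiffness and friction settings):

```python
import numpy as np

def project_distance_constraint(p1, p2, w1, w2, rest_length, stiffness=1.0):
    """One PBD distance-constraint projection step.

    p1, p2: particle positions; w1, w2: inverse masses (0 = pinned particle).
    Returns corrected (p1, p2) moved toward the rest length.
    """
    d = p2 - p1
    dist = np.linalg.norm(d)
    if dist == 0.0 or w1 + w2 == 0.0:
        return p1, p2
    # constraint violation, distributed between the particles by inverse mass
    correction = stiffness * (dist - rest_length) / (w1 + w2) * (d / dist)
    return p1 + w1 * correction, p2 - w2 * correction

# a stretched pair: 1.5 m apart with rest length 1.0 m, equal masses
p1, p2 = np.array([0.0, 0.0, 0.0]), np.array([1.5, 0.0, 0.0])
q1, q2 = project_distance_constraint(p1, p2, 1.0, 1.0, rest_length=1.0)
print(q1, q2)   # each particle moves 0.25 m inward
```

Because the projection operates on positions directly, any collision-induced displacement becomes part of the new equilibrium rather than a spring force to be undone, which is the mechanical reason the GPU-PBD cloth does not exhibit the popping seen with Unity Cloth.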

4.3.3. Conclusions

This experiment demonstrates that while Unity Cloth allows for basic capsule-based collisions, it suffers from instability in maintaining constraint satisfaction during sequential interactions, leading to unrealistic cloth responses. In particular, deformation artifacts such as popping and visible triangle meshes become apparent during continuous hand movement, which disrupts the visual continuity and realism.
In contrast, the proposed GPU-PBD Cloth system exhibits robust and stable collision behavior by leveraging a compute-shader-based parallel collision kernel. It maintains physically plausible deformations even under continuous sweeping interactions, enabling more immersive and realistic hand–cloth interactions in XR environments.
Moreover, as shown in Figure 10, the GPU-PBD system significantly improves visual fidelity. While Unity Cloth reveals rigid triangle structures during interaction (left), GPU-PBD reproduces smooth, natural wrinkles and folds (right), even under the same resolution settings. Despite a moderate reduction in frame rate at higher resolutions, this system offers a clear advantage in visual immersion and realism, highlighting its suitability for high-quality real-time XR simulations.

4.4. Advanced Hand Mesh Collision Interaction

4.4.1. Real-Time Hand Mesh-Based Collision Implementation

To enable physically realistic interactions between the virtual cloth and the user’s hands, this study utilizes the real-time hand mesh acquisition functionality provided by the OVRMesh component of Meta Quest 3. The OVRMesh provides a dynamically updated high-resolution 3D structure of the user’s hands, including detailed representations of fingers and palms, allowing fine-grained collision detection.
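Mesh-based collision of this kind typically tests each cloth particle against nearby hand-mesh triangles and pushes penetrating particles out along the triangle normal. The core point-versus-triangle projection can be sketched as follows; this is an illustrative CPU version (edge and vertex regions are omitted for brevity, and the authors' Compute Shader version additionally consumes the dynamically updated OVRMesh vertex buffers).

```python
import numpy as np

def resolve_triangle_collision(p, a, b, c, thickness=0.005):
    """Push particle p off triangle (a, b, c) when closer than `thickness`.

    Only handles the face region (projection inside the triangle); the
    thickness value is an assumed cloth/collision offset, not the paper's.
    """
    n = np.cross(b - a, c - a)
    n /= np.linalg.norm(n)
    signed = np.dot(p - a, n)            # signed distance to the plane
    if abs(signed) >= thickness:
        return p
    q = p - signed * n                   # projection onto the plane
    # barycentric inside test for the projected point
    v0, v1, v2 = b - a, c - a, q - a
    d00, d01, d11 = v0 @ v0, v0 @ v1, v1 @ v1
    d20, d21 = v2 @ v0, v2 @ v1
    denom = d00 * d11 - d01 * d01
    v = (d11 * d20 - d01 * d21) / denom
    w = (d00 * d21 - d01 * d20) / denom
    if v < 0 or w < 0 or v + w > 1:
        return p                         # projection lies outside the face
    side = 1.0 if signed >= 0 else -1.0
    return q + side * thickness * n      # place particle at the offset surface

a = np.array([0.0, 0.0, 0.0])            # one hand-mesh triangle
b = np.array([1.0, 0.0, 0.0])
c = np.array([0.0, 0.0, 1.0])
p = np.array([0.25, 0.001, 0.25])        # particle almost touching the palm
res = resolve_triangle_collision(p, a, b, c)
print(res)
```

Running this test against the full, per-frame hand mesh is what lets fingertips and palm contours deform the cloth with far more detail than the 19 capsule approximation.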

4.4.2. Functional Limitations of Unity Cloth Component

Unity’s built-in Cloth component only supports collision with simple primitive colliders such as spheres and capsules. As a result, it cannot realistically interact with complex hand geometries, nor does it support real-time hand mesh collision detection. This fundamentally limits its ability to create immersive and natural hand–cloth interactions.

4.4.3. Real-Time Hand Mesh-Based Interaction with GPU-PBD System

The proposed GPU-PBD Cloth system leverages real-time hand mesh updates as collision surfaces to achieve advanced interactions that are not possible with Unity’s default system. Two major hand–cloth interaction scenarios were implemented:
Right-Hand Push Interaction: The user pushes the cloth from right to left using their right hand, deforming the cloth naturally during the motion. Figure 11 illustrates the result of the right-hand push interaction, where the user’s hand mesh deforms the cloth in real time.
Two-Hand Stretching Interaction: The user grasps the cloth with both hands and pulls outward in opposite directions, stretching the cloth surface. Figure 12 presents a two-hand stretching interaction, comparing a real-world photo with the corresponding Unity simulation.
These results demonstrate that the GPU-PBD system can faithfully reflect the complex shapes of the user’s hands, enabling physically plausible and immersive cloth deformations in real time within XR environments.

4.4.4. Experimental Setup

The experimental conditions for evaluating the hand mesh-based collision interactions are summarized in Table 7.

4.4.5. Summary and Discussion

  • Unity Cloth: Limited to primitive colliders, unable to handle natural interactions with complex hand geometries.
  • GPU-PBD Cloth: By leveraging real-time hand mesh collision, the system enables realistic cloth deformations through advanced interactions such as pushing and stretching.
These experimental results confirm that the GPU-PBD system offers superior functional capabilities compared to Unity’s default cloth simulation, particularly in achieving immersive and physically realistic user–cloth interactions in XR environments.
However, a notable limitation was observed during high-speed two-hand pulling interactions using real-time hand mesh input. When users rapidly pulled the cloth outward using both hands, the simulation occasionally produced unnatural tearing or distortion effects. This issue is visually illustrated in Figure 13, where the cloth undergoes excessive stretching and tearing due to instability during rapid two-hand interaction using the real-time hand mesh. Such artifacts highlight the need for further refinement in the constraint projection algorithm and temporal integration to maintain stability under high-tension scenarios.

4.5. Quantitative User Evaluation

To evaluate the perceptual realism and interaction quality of the cloth simulation systems, a quantitative user study was conducted comparing Unity’s built-in Cloth system and the proposed GPU-PBD Cloth system. Each participant interacted with both systems using hand-based interaction scenarios with capsule colliders on the Meta Quest 3 device.
A total of 10 participants (6 male, 4 female, aged 20–33), all of whom had prior experience with XR devices, completed the evaluation. After performing the tasks, participants rated both systems based on five criteria using a five-point Likert scale (1 = Strongly Disagree, 5 = Strongly Agree). The evaluation criteria were:
  • Visual Realism: “The appearance of the cloth, including folds and wrinkles, looked similar to real fabric.”
  • Interaction Naturalness: “The cloth responded physically naturally when touched, pushed, or stretched.”
  • Responsiveness: “The cloth responded promptly to my hand movements without noticeable delay.”
  • Spatial Alignment: “The cloth correctly interacted with real-world surfaces such as desks and floors without misalignment.”
  • Immersive Experience: “The simulation felt immersive and enhanced the overall realism of the XR environment.”
Figure 14 presents the average scores across these criteria. The results show that the GPU-PBD system significantly outperformed Unity Cloth in Visual Realism (4.6 vs. 3.1) and Interaction Naturalness (4.8 vs. 3.0), demonstrating its superior physical fidelity and natural behavior in user interactions.
Although the two systems showed comparable performance in Responsiveness (4.2 vs. 3.4), the GPU-PBD system received a substantially higher score in Spatial Alignment (4.6 vs. 1.1), clearly indicating Unity Cloth’s limitations in accurately responding to complex real-world geometries such as furniture edges and mesh surfaces. In addition, the GPU-PBD system also surpassed Unity Cloth in Immersive Experience (4.8 vs. 2.6), reflecting a stronger overall sense of realism and engagement perceived by users.
These results reinforce the technical findings presented in earlier sections and further highlight the perceptual and experiential advantages of the proposed GPU-PBD cloth simulation system in XR environments.

5. Results and Conclusions

This study proposed a GPU-accelerated Position-Based Dynamics (PBD) cloth simulation system for real-time use in XR environments and conducted a comprehensive comparison with Unity’s built-in Cloth component in terms of performance and functional capabilities.
Performance evaluations across various resolutions (8 × 8 to 128 × 128) showed that Unity Cloth maintained a consistent 72 FPS in all scenarios, demonstrating its strong internal optimization. This stable frame rate makes Unity Cloth highly suitable for lightweight XR applications or rapid prototyping scenarios where simple interactions and consistent runtime behavior are prioritized. In comparison, the GPU-PBD system sustained equivalent 72 FPS performance up to a 32 × 32 resolution. At 64 × 64, it maintained 65 FPS for the single-sided setup and 40 FPS for the double-sided version. At 128 × 128, performance dropped to 42 FPS (single-sided) and 16 FPS (double-sided). Despite the decline at higher resolutions, the GPU-PBD system preserved real-time interactivity up to 64 × 64 in double-sided mode, which is generally sufficient for interactive XR tasks.
Functionally, Unity Cloth is limited to single-sided rendering and supports only simple primitive colliders. The GPU-PBD system, by contrast, supports two-sided cloth behavior, collision with scanned RoomMesh surfaces, and real-time interaction with OVRMesh-based hand models. It offers fine-grained control of distance and bending constraints and is highly extensible due to its modular Compute Shader architecture.
In physical interaction tests, GPU-PBD cloth objects realistically draped and responded to real-world structures such as desks, while Unity Cloth often displayed unnatural behavior such as mesh penetration or inaccurate folding. Similarly, in hand interaction experiments using both OVRMesh and OVRSkeleton capsule colliders, the GPU-PBD system provided stable and physically plausible responses, whereas Unity Cloth frequently produced popping artifacts or failed to detect complex hand geometry.
Quantitative user evaluation further supported these findings. In a 10-participant study using 5 evaluation metrics—Visual Realism, Interaction Naturalness, Responsiveness, Spatial Alignment, and Immersive Experience—the GPU-PBD system received higher ratings in all categories. Participants especially noted its realistic response to folding, wrinkling, and surface sliding during hand interactions, in contrast to Unity Cloth, which was perceived as physically limited under similar conditions.
In conclusion, while Unity Cloth remains a viable solution for simple, low-complexity XR scenarios due to its consistent frame rate and ease of integration, the GPU-PBD system demonstrates clear advantages in handling complex environments, realistic interactions, and immersive simulation experiences. Designed with portability in mind, the framework is suitable for adaptation to other XR devices such as Apple Vision Pro and Microsoft HoloLens 2. Future research will explore the expansion of the system beyond cloth to include general deformable and cutting simulations based on the PBD framework, enabling advanced applications in medical training, anatomical education, and emergency response simulations.

Author Contributions

Conceptualization, M.H.; Methodology, T.K.; Software, T.K.; Formal analysis, J.M. and T.K.; Writing—original draft preparation, T.K.; Writing—review and editing, T.K., M.H. and J.M.; Project administration, M.H. All authors have read and agreed to the published version of the manuscript.

Funding

This research was supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF), funded by the Ministry of Education (NRF-2022R1I1A3069371), by BK21 FOUR (Fostering Outstanding Universities for Research) (No. 5199990914048), and by the Soonchunhyang University Research Fund.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The original contributions presented in the study are included in the article. Further inquiries can be directed to the corresponding author.

Conflicts of Interest

The funders had no role in the design of this study; in the collection, analyses, or interpretation of data; in the writing of the manuscript; or in the decision to publish the results.

Figure 1. Meta Quest 3 external appearance and system overview.
Figure 2. GPU-based PBD cloth simulation system architecture.
Figure 3. Flowchart of the GPU-based PBD cloth simulation system.
Figure 4. FPS performance comparison across different cloth resolutions.
Figure 5. Peak GPU memory bandwidth usage (Read/Write) across different cloth resolutions for Unity Cloth and GPU-PBD systems.
Figure 6. Flowchart of SubMesh extraction and collision interaction process.
Figure 7. Comparison of collision responses between Unity Cloth and GPU-PBD Cloth. Images (ac) show Unity Cloth responses from front, top, and corner-drop views; (df) show corresponding results from the GPU-PBD Cloth system. Cloth was dropped vertically onto a flat surface and diagonally onto the desk edge to evaluate physical interaction accuracy.
Figure 8. Collision response of Unity Cloth using 19 capsule colliders attached to the hand joints. Popping artifacts can be observed where cloth particles snap back after collision.
Figure 9. Collision response of GPU-PBD Cloth using 19 capsule colliders attached to the hand joints. The cloth maintains physically plausible deformations after collision.
Figure 10. Visual comparison between Unity Cloth ((left), red circle) and GPU-PBD Cloth ((right), blue circle). Unity Cloth reveals mesh triangle structure and unnatural deformation, whereas GPU-PBD simulates smooth, realistic folds with physically plausible contact.
Figure 11. GPU-PBD Cloth: Pushing the cloth from right to left using the right hand mesh.
Figure 12. GPU-PBD Cloth Stretching Interaction: (Left) Real-world photo of stretching the cloth with both hands; (Right) Unity scene visualization showing real-time collision between the hand mesh and the virtual cloth.
Figure 13. Limitation observed during high-speed two-hand interaction using hand mesh input. From left to right: (1) cloth begins to stretch under dual-hand control; (2) excessive stretching results in unnatural deformation; (3) tearing artifacts occur due to instability in constraint resolution and insufficient temporal granularity.
Figure 14. Quantitative user evaluation comparing Unity Cloth and GPU-PBD systems. Scores are averaged from 10 participants across 5 criteria, using a 5-point Likert scale.
Table 1. Meta Quest 3 detailed specifications.
Item | Specification
Processor | Qualcomm Snapdragon XR2 Gen 2
Memory | 8 GB LPDDR5 SDRAM
Internal Storage | 512 GB UFS 3.1
Display | 2 × 2064 × 2208 fast-switch LCD, 90–120 Hz refresh rate
Lens Type | Pancake lens
Field of View (FOV) | 110° horizontal, 96° vertical
Interpupillary Distance (IPD) | 58–70 mm (4-stage adjustment)
Eye Relief | 4-stage adjustment
Tracking | 6 Degrees of Freedom (6DoF)
Cameras | 2 external RGB cameras, 4 wide-angle IR cameras
Connectivity | Wi-Fi 6E, Bluetooth 5.3
Battery | Built-in 5060 mAh Li-ion
Operating System | Meta Horizon OS
Headset Dimensions | 260 × 98 × 192 mm, 515 g
Controller Dimensions | 130 × 70 × 62 mm, 103 g
Ports | USB Type-C × 1, 3.5 mm Audio Jack × 1
Manufacturer | Meta Platforms, Menlo Park, CA, USA
Table 2. Overview of experimental scenarios: (1) resolution vs. frame rate, (2) RoomMesh collision, (3) hand capsule collider collision, (4) hand mesh collider collision.
No. | Purpose | Test Setup | Compared Systems
1 | Measure FPS performance across resolutions (8 × 8 to 128 × 128) | Gravity-only cloth drop in static space using Meta Quest 3 | Unity Cloth vs. GPU-PBD (Single & Double-Sided)
2 | Test physical interaction accuracy between cloth and real-world surface | SubMesh from controller-selected RoomMesh region | Unity Cloth vs. GPU-PBD (Double-Sided)
3 | Evaluate hand interaction realism using 19 capsule colliders | Hand sweeps across cloth using capsule colliders | Unity Cloth vs. GPU-PBD (Double-Sided)
4 | Evaluate natural deformation via hand mesh geometry | Real-time hand mesh in push and two-hand stretch interactions | GPU-PBD (Double-Sided)
Table 3. Performance comparison by resolution: average FPS over five trials (single and double).
Resolution | Unity Cloth (FPS) | GPU-PBD (Single-Sided) | GPU-PBD (Double-Sided)
8 × 8 | 72.0 | 72.0 | 72.0
16 × 16 | 72.0 | 72.0 | 72.0
32 × 32 | 72.0 | 72.0 | 72.0
64 × 64 | 72.0 | 65.0 | 40.0
128 × 128 | 72.0 | 42.0 | 16.0
Table 4. Feature comparison between Unity Cloth and GPU-PBD Cloth (single-sided vs. double-sided).
Feature | Unity Cloth | GPU-PBD (Single-Sided) | GPU-PBD (Double-Sided)
Double-Sided Cloth Support | X | X | O
Hand Mesh Collision Handling | X | X | O
RoomMesh (Reality Mesh) Collision Handling | X | X | O
Fine-Grained Constraint Adjustment | Limited | O | O
Compute Shader Extensibility | X | O | O
Real-Time Performance (Up to 32 × 32) | O (72 FPS) | O (72 FPS) | O (72 FPS)
Real-Time Performance (64 × 64 and above) | O (72 FPS) | 65 FPS at 64 × 64 | 40 FPS at 64 × 64
Note: O = supported, X = not supported.
Table 5. Experimental setup for RoomMesh collision testing.
Item | Details
Cloth Resolution | 32 × 32
Collision Surface | Extracted SubMesh (desk surface)
Drop Height | Approximately 30 cm above the desk
Physical Parameters | Gravity, mass, stiffness (identical settings for both systems)
Table 6. Experimental setup for capsule-based hand–cloth collision.
Item | Details
Cloth Resolution | 32 × 32
Hand Tracking Method | OVRSkeleton (Meta Quest 3)
Collision Targets | 19 Unity Capsule Colliders (attached to each joint)
Interaction Scenario | Hand moves across the cloth from right to left
Table 7. Experimental setup for hand mesh collision testing.
Item | Details
Cloth Resolution | 32 × 32
Hand Tracking Method | Meta Quest 3 OVRMesh (Real-Time Update)
Collision Surface | Dynamically updated hand mesh
Interaction Scenarios | Right-hand push (moving from right to left), two-hand stretching (pulling cloth in opposite directions)

