Search Results (194)

Search Parameters:
Keywords = virtual reality (VR) space

26 pages, 4899 KiB  
Article
Material Perception in Virtual Environments: Impacts on Thermal Perception, Emotions, and Functionality in Industrial Renovation
by Long He, Minjia Wu, Yue Ma, Di Cui, Yongjiang Wu and Yang Wei
Buildings 2025, 15(15), 2698; https://doi.org/10.3390/buildings15152698 - 31 Jul 2025
Viewed by 236
Abstract
Industrial building renovation is a sustainable strategy to preserve urban heritage while meeting modern needs. However, how interior material scenes affect users’ emotions, thermal perception, and functional preferences remains underexplored in adaptive reuse contexts. This study used virtual reality (VR) to examine four common material scenes—wood, concrete, red brick, and white-painted surfaces—within industrial renovation settings. A total of 159 participants experienced four Lumion-rendered VR environments and rated them on thermal perception (visual warmth, thermal sensation, comfort), emotional response (arousal, pleasure, restoration), and functional preference. Data were analyzed using repeated measures ANOVA and Pearson correlation. Wood and red brick scenes were associated with warm visuals; wood scenes received the highest ratings for thermal comfort and pleasure, white-painted scenes for restoration and arousal, and concrete scenes, the lowest scores overall. Functional preferences varied by space: white-painted and concrete scenes were most preferred in study/work settings, wood in social spaces, wood and red brick in rest areas, and concrete in exhibition spaces. By isolating material variables in VR, this study offers a novel empirical approach and practical guidance for material selection in adaptive reuse to enhance user comfort, emotional well-being, and spatial functionality in industrial heritage renovations. Full article
(This article belongs to the Section Building Materials, and Repair & Renovation)
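The analysis described in this abstract (a repeated measures ANOVA across the four material scenes plus Pearson correlations between rating dimensions) can be sketched in a few lines. The snippet below is a minimal illustration under assumed data, not the authors' actual pipeline; the file name and the column names (participant, scene, thermal_comfort, pleasure) are hypothetical placeholders.

```python
# Minimal sketch of the analysis named in the abstract: a repeated measures
# ANOVA over material scenes and a Pearson correlation between two rating
# dimensions. File and column names are hypothetical placeholders.
import pandas as pd
from scipy.stats import pearsonr
from statsmodels.stats.anova import AnovaRM

# Long-format ratings: one row per participant x scene (wood, concrete,
# red_brick, white_paint), one column per rating dimension.
ratings = pd.read_csv("vr_material_ratings.csv")  # hypothetical file

# Within-subject ANOVA: does thermal comfort differ across material scenes?
anova = AnovaRM(
    data=ratings,
    depvar="thermal_comfort",
    subject="participant",
    within=["scene"],
).fit()
print(anova.anova_table)

# Pearson correlation between thermal comfort and pleasure ratings.
r, p = pearsonr(ratings["thermal_comfort"], ratings["pleasure"])
print(f"r = {r:.3f}, p = {p:.4f}")
```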

40 pages, 16352 KiB  
Review
Surface Protection Technologies for Earthen Sites in the 21st Century: Hotspots, Evolution, and Future Trends in Digitalization, Intelligence, and Sustainability
by Yingzhi Xiao, Yi Chen, Yuhao Huang and Yu Yan
Coatings 2025, 15(7), 855; https://doi.org/10.3390/coatings15070855 - 20 Jul 2025
Viewed by 718
Abstract
As vital material carriers of human civilization, earthen sites are experiencing continuous surface deterioration under the combined effects of weathering and anthropogenic damage. Traditional surface conservation techniques, due to their poor compatibility and limited reversibility, struggle to address the compound challenges of micro-scale degradation and macro-scale deformation. With the deep integration of digital twin technology, spatial information technologies, intelligent systems, and sustainable concepts, earthen site surface conservation technologies are transitioning from single-point applications to multidimensional integration. However, challenges remain in terms of the insufficient systematization of technology integration and the absence of a comprehensive interdisciplinary theoretical framework. Based on the dual-core databases of Web of Science and Scopus, this study systematically reviews the technological evolution of surface conservation for earthen sites between 2000 and 2025. CiteSpace 6.2 R4 and VOSviewer 1.6 were used for bibliometric visualization analysis, which was innovatively combined with manual close reading of the key literature and GPT-assisted semantic mining (error rate < 5%) to efficiently identify core research themes and infer deeper trends. The results reveal the following: (1) technological evolution follows a three-stage trajectory—from early point-based monitoring technologies, such as remote sensing (RS) and the Global Positioning System (GPS), to spatial modeling technologies, such as light detection and ranging (LiDAR) and geographic information systems (GIS), and, finally, to today’s integrated intelligent monitoring systems based on multi-source fusion; (2) the key surface technology system comprises GIS-based spatial data management, high-precision modeling via LiDAR, 3D reconstruction using oblique photogrammetry, and building information modeling (BIM) for structural protection, while cutting-edge areas focus on digital twin (DT) and the Internet of Things (IoT) for intelligent monitoring, augmented reality (AR) for immersive visualization, and blockchain technologies for digital authentication; (3) future research is expected to integrate big data and cloud computing to enable multidimensional prediction of surface deterioration, while virtual reality (VR) will overcome spatial–temporal limitations and push conservation paradigms toward automation, intelligence, and sustainability. This study, grounded in the technological evolution of surface protection for earthen sites, constructs a triadic framework of “intelligent monitoring–technological integration–collaborative application,” revealing the integration needs between DT and VR for surface technologies. It provides methodological support for addressing current technical bottlenecks and lays the foundation for dynamic surface protection, solution optimization, and interdisciplinary collaboration. Full article

25 pages, 442 KiB  
Article
Beyond Books: Student Perspectives on Emerging Technologies, Usability, and Ethics in the Library of the Future
by Faisal Kalota, Benedicta Frema Boamah, Hesham Allam, Tyler Schisler and Grace Witty
Publications 2025, 13(3), 32; https://doi.org/10.3390/publications13030032 - 15 Jul 2025
Viewed by 411
Abstract
This research aims to understand the evolving role of academic libraries, focusing on student perceptions of current services and their vision for the future. Data was collected using a survey at a midwestern research university in the United States. The survey contained both quantitative and qualitative questions. The objective of the survey was to understand the current utilization of library services and students’ future visions for academic libraries. Qualitative and quantitative analysis techniques were utilized as part of the study. Thematic analysis was employed as part of the qualitative analysis, while descriptive and inferential analysis techniques were utilized in the quantitative analysis. The findings reveal that many students use libraries for traditional functions such as studying and accessing resources. There is also an inclination toward digitalization due to convenience, accessibility, and environmental sustainability; however, print materials remain relevant as well. Another finding was a lack of awareness among some students regarding available library services, indicating a need for better marketing and communication strategies. Students envision future libraries as technology-driven spaces integrating artificial intelligence (AI), augmented reality (AR), virtual reality (VR), and innovative collaborative environments. Ethical considerations surrounding AI, including privacy, bias, and transparency, are crucial factors that must be addressed. Some of the actionable recommendations include integrating ethical AI, implementing digital literacy initiatives, conducting ongoing usability and user experience (UX) research within the library, and fostering cross-functional collaboration to enhance library services and student learning. Full article

27 pages, 2935 KiB  
Article
A Pilot Study on Emotional Equivalence Between VR and Real Spaces Using EEG and Heart Rate Variability
by Takato Kobayashi, Narumon Jadram, Shukuka Ninomiya, Kazuhiro Suzuki and Midori Sugaya
Sensors 2025, 25(13), 4097; https://doi.org/10.3390/s25134097 - 30 Jun 2025
Viewed by 594
Abstract
In recent years, the application of virtual reality (VR) for spatial evaluation has gained traction in the fields of architecture and interior design. However, for VR to serve as a viable substitute for real-world environments, it is essential that experiences within VR elicit emotional responses comparable to those evoked by actual spaces. Despite this prerequisite, there remains a paucity of studies that objectively compare and evaluate the emotional responses elicited by VR and real-world environments. Consequently, it is not yet fully understood whether VR can reliably replicate the emotional experiences induced by physical spaces. This study aims to investigate the influence of presentation modality on emotional responses by comparing a VR space and a real-world space with identical designs. The comparison was conducted using both subjective evaluations (Semantic Differential method) and physiological indices (electroencephalography and heart rate variability). The results indicated that the real-world environment was associated with impressions of comfort and preference, whereas the VR environment evoked impressions characterized by heightened arousal. Additionally, elevated beta wave activity and increased beta/alpha ratios were observed in the VR condition, suggesting a state of high arousal, as further supported by positioning on the Emotion Map. Moreover, analysis of pNN50 revealed a transient increase in parasympathetic nervous activity during the VR experience. This study is positioned as a pilot investigation to explore physiological and emotional differences between VR and real spaces. Full article
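For readers unfamiliar with the two physiological indices mentioned in this abstract, pNN50 (the percentage of successive inter-beat intervals differing by more than 50 ms) and the EEG beta/alpha power ratio can both be computed in a few lines. The sketch below is illustrative only and assumes RR intervals in milliseconds, a single EEG channel, and conventional band edges; it is not the study's processing pipeline.

```python
# Illustrative computation of pNN50 (from RR intervals, in ms) and the EEG
# beta/alpha power ratio (via Welch's PSD). Not the authors' pipeline; the
# sampling rate, band edges, and synthetic inputs are assumptions.
import numpy as np
from scipy.signal import welch

def pnn50(rr_ms: np.ndarray) -> float:
    """Percentage of successive RR-interval differences exceeding 50 ms."""
    diffs = np.abs(np.diff(rr_ms))
    return 100.0 * np.mean(diffs > 50.0)

def band_power(freqs, psd, lo, hi):
    """Integrate the PSD over one frequency band."""
    mask = (freqs >= lo) & (freqs < hi)
    return np.trapz(psd[mask], freqs[mask])

def beta_alpha_ratio(eeg: np.ndarray, fs: float) -> float:
    """Beta (13-30 Hz) over alpha (8-13 Hz) power from one EEG channel."""
    freqs, psd = welch(eeg, fs=fs, nperseg=int(2 * fs))
    return band_power(freqs, psd, 13, 30) / band_power(freqs, psd, 8, 13)

# Example with synthetic data.
rng = np.random.default_rng(0)
rr = 800 + rng.normal(0, 40, size=300)   # synthetic RR intervals (ms)
eeg = rng.normal(0, 1, size=60 * 256)    # 60 s of EEG sampled at 256 Hz
print(f"pNN50 = {pnn50(rr):.1f}%")
print(f"beta/alpha = {beta_alpha_ratio(eeg, fs=256):.2f}")
```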

13 pages, 3210 KiB  
Article
Bridging Tradition and Innovation: Transformative Educational Practices in Museums with AI and VR
by Michele Domenico Todino, Eliza Pitri, Argyro Fella, Antonia Michaelidou, Lucia Campitiello, Francesca Placanica, Stefano Di Tore and Maurizio Sibilio
Computers 2025, 14(7), 257; https://doi.org/10.3390/computers14070257 - 30 Jun 2025
Viewed by 947
Abstract
This paper explores the intersection of folk art, museums, and education in the 20th century, with a focus on the concept of art as experience, emphasizing the role of museums as active, inclusive learning spaces. A collaboration between the University of Salerno and the University of Nicosia has developed virtual museum environments using virtual reality (VR) to enhance engagement with cultural heritage. These projects aim to make museums more accessible and interactive, with future potential in integrating artificial-intelligence-driven non-player characters (NPCs) and VR strategies for personalized visitor experiences of the Nicosia Folk Art Museum. Full article

26 pages, 8159 KiB  
Article
A Combined Mirror–EMG Robot-Assisted Therapy System for Lower Limb Rehabilitation
by Florin Covaciu, Bogdan Gherman, Calin Vaida, Adrian Pisla, Paul Tucan, Andrei Caprariu and Doina Pisla
Technologies 2025, 13(6), 227; https://doi.org/10.3390/technologies13060227 - 3 Jun 2025
Viewed by 2104
Abstract
This paper presents the development and initial evaluation of a novel protocol for robot-assisted lower limb rehabilitation. It integrates dual-modal patient interaction, employing mirror therapy and an auto-adaptive EMG-driven control system, designed to enhance lower limb rehabilitation in patients with hemiparesis impairments. The system features a robotic platform specifically engineered for lower limb rehabilitation, which operates in conjunction with a virtual reality (VR) environment. This immersive environment comprises a digital twin of the robotic system alongside a human avatar representing the patient and a set of virtual targets to be reached by the patient. To implement mirror therapy, the proposed protocol utilizes a set of inertial sensors placed on the patient’s healthy limb to capture real-time motion data. The auto-adaptive protocol takes as input the EMG signals (if any) from sensors placed on the impaired limb and performs the required motions to reach the virtual targets in the VR application. By synchronizing the motions of the healthy limb with the digital twin in the VR space, the system aims to promote neuroplasticity, reduce pain perception, and encourage engagement in rehabilitation exercises. Initial laboratory trials demonstrate promising outcomes in terms of improved motor function and subject motivation. This research not only underscores the efficacy of integrating robotics and virtual reality in rehabilitation but also opens avenues for advanced personalized therapies in clinical settings. Future work will investigate the efficiency of the proposed solution using patients, thus demonstrating clinical usability, and explore the potential integration of additional feedback mechanisms to further enhance the therapeutic efficacy of the system. Full article

18 pages, 2855 KiB  
Article
Visual Environment Effects on Wayfinding in Underground Spaces
by Jupeng Wu and Soobeen Park
Buildings 2025, 15(11), 1918; https://doi.org/10.3390/buildings15111918 - 2 Jun 2025
Viewed by 421
Abstract
This study investigates how visual environmental factors influence wayfinding behavior in underground spaces, with a particular focus on cultural differences between Korean and Chinese college students. A virtual reality (VR) environment was developed using Unity3D to simulate an underground space, incorporating five key visual variables: passage width, brightness, color temperature (warm vs. cool), the presence or absence of obstacles, and the configuration of sign systems. Participants were divided into two groups—Korean (Group K) and Chinese (Group C)—and engaged in a VR-based wayfinding experiment followed by an emotional vocabulary evaluation. The results indicate significant cultural differences in spatial perception and navigation preferences. Chinese participants preferred narrower, brighter, and cool-colored passages, associating them with an improved sense of direction, lower stress, and enhanced attention. In contrast, Korean participants favored wider, darker, and warm-colored passages, valuing accessibility, stability, and distance perception. Both groups showed a strong preference for environments with floor signage and combined sign systems, though Korean participants were more tolerant of obstacles. These findings provide practical insights for designing more inclusive and navigable underground public spaces across different cultural contexts. Full article
(This article belongs to the Section Architectural Design, Urban Science, and Real Estate)

28 pages, 5598 KiB  
Article
Integrating Virtual Reality to Enhance Thermal Comfort in Educational Spaces: A Pilot Study Towards Sustainable Learning Environments
by Rund Hiyasat, Laurens Luyten and Lindita Bande
Sustainability 2025, 17(11), 5033; https://doi.org/10.3390/su17115033 - 30 May 2025
Cited by 1 | Viewed by 605
Abstract
This pilot study explores the use of Virtual Reality (VR) to enhance perceived thermal comfort (TC) within educational settings, where physical modifications to classrooms are often limited. As sustainability becomes a priority in building design, VR and Immersive Virtual Environments (IVE) offer an innovative approach to optimizing user comfort without altering physical conditions. This study investigates how VR influences comfort perception through qualitative data collected from semi-structured interviews with four architecture students who attended academic presentations in a VR setting. Thematic analysis identified key factors affecting user experience, including visual satisfaction, physical discomfort, engagement, and perceived shifts in comfort parameters. Results indicate that VR environments featuring natural and calming visual elements can enhance perceived TC by reducing stress and increasing focus. However, challenges such as device discomfort, visual limitations, and distractions from non-task-related virtual elements were also noted. Findings highlight the importance of aligning visual elements with task requirements, optimizing brightness for engagement, and ensuring better connectivity between VR settings and real-world surroundings. As a pilot study, these findings provide preliminary insights into VR’s potential to support user comfort and engagement in student-centered learning environments, particularly in automated climate-controlled spaces with limited user control, laying the groundwork for future research. Full article

23 pages, 9051 KiB  
Article
Predicting User Attention States from Multimodal Eye–Hand Data in VR Selection Tasks
by Xiaoxi Du, Jinchun Wu, Xinyi Tang, Xiaolei Lv, Lesong Jia and Chengqi Xue
Electronics 2025, 14(10), 2052; https://doi.org/10.3390/electronics14102052 - 19 May 2025
Viewed by 785
Abstract
Virtual reality (VR) devices that integrate eye-tracking and hand-tracking technologies can capture users’ natural eye–hand data in real time within a three-dimensional virtual space, providing new opportunities to explore users’ attentional states during natural 3D interactions. This study aims to develop an attention-state prediction model based on the multimodal fusion of eye and hand features, which distinguishes whether users primarily employ goal-directed attention or stimulus-driven attention during the execution of their intentions. In our experiment, we collected three types of data—eye movements, hand movements, and pupil changes—and instructed participants to complete a virtual button selection task. This setup allowed us to establish a binary ground truth label for attentional state during the execution of selection intentions for model training. To investigate the impact of different time windows on prediction performance, we designed eight time windows ranging from 0 to 4.0 s (in increments of 0.5 s) and compared the performance of eleven algorithms, including logistic regression, support vector machine, naïve Bayes, k-nearest neighbors, decision tree, linear discriminant analysis, random forest, AdaBoost, gradient boosting, XGBoost, and neural networks. The results indicate that, within the 3 s window, the gradient boosting model performed best, achieving a weighted F1-score of 0.8835 and an Accuracy of 0.8860. Furthermore, the analysis of feature importance demonstrated that the multimodal eye–hand features play a critical role in the prediction. Overall, this study introduces an innovative approach that integrates three types of multimodal eye–hand behavioral and physiological data within a virtual reality interaction context. This framework provides both theoretical and methodological support for predicting users’ attentional states within short time windows and contributes practical guidance for the design of attention-adaptive 3D interfaces. In addition, the proposed multimodal eye–hand data fusion framework also demonstrates potential applicability in other three-dimensional interaction domains, such as game experience optimization, rehabilitation training, and driver attention monitoring. Full article
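The model-comparison step described in this abstract reduces to training standard classifiers on windowed eye–hand features and scoring them with a weighted F1. Below is a minimal scikit-learn sketch of the best-performing setup reported (gradient boosting); the feature matrix, labels, and hyperparameters are synthetic stand-ins, not the study's data.

```python
# Minimal sketch of the reported best setup: a gradient boosting classifier on
# multimodal eye-hand features, scored with weighted F1 and accuracy.
# Features and labels here are synthetic placeholders.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import accuracy_score, f1_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
X = rng.normal(size=(1000, 24))     # e.g., gaze, hand-kinematic, pupil features
y = rng.integers(0, 2, size=1000)   # 0 = stimulus-driven, 1 = goal-directed

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=42
)

clf = GradientBoostingClassifier(random_state=42).fit(X_train, y_train)
pred = clf.predict(X_test)

print("weighted F1:", f1_score(y_test, pred, average="weighted"))
print("accuracy:   ", accuracy_score(y_test, pred))
```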

20 pages, 6282 KiB  
Article
Neural-Network-Driven Intention Recognition for Enhanced Human–Robot Interaction: A Virtual-Reality-Driven Approach
by Ali Kamali Mohammadzadeh, Elnaz Alinezhad and Sara Masoud
Machines 2025, 13(5), 414; https://doi.org/10.3390/machines13050414 - 15 May 2025
Viewed by 860
Abstract
Intention recognition in Human–Robot Interaction (HRI) is critical for enabling robots to anticipate and respond to human actions effectively. This study explores the application of deep learning techniques for the classification of human intentions in HRI, utilizing data collected from Virtual Reality (VR) environments. By leveraging VR, a controlled and immersive space is created, where human behaviors can be closely monitored and recorded. Ensemble deep learning models, particularly Convolutional Neural Networks (CNNs), Long Short-Term Memory (LSTM), and Transformers, are trained on this rich dataset to recognize and predict human intentions with high accuracy. While CNN and CNN-LSTM models yielded high accuracy rates, they encountered difficulties in accurately identifying certain intentions (e.g., standing and walking). In contrast, the CNN-Transformer model outshone its counterparts, achieving near-perfect precision, recall, and F1-scores. The proposed approach demonstrates the potential for enhancing HRI by providing robots with the ability to anticipate and act on human intentions in real time, leading to more intuitive and effective collaboration between humans and robots. Experimental results highlight the effectiveness of VR as a data collection tool and the promise of deep learning in advancing intention recognition in HRI. Full article
(This article belongs to the Section Advanced Manufacturing)
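As a rough illustration of the CNN-Transformer hybrid named in this abstract, the sketch below stacks a 1-D convolutional front end on a Transformer encoder for sequence classification. It is a generic pattern under assumed input shapes, channel counts, and intention classes, not the authors' architecture.

```python
# Generic CNN + Transformer-encoder classifier for motion sequences, showing
# the hybrid pattern mentioned in the abstract. Input shape, hyperparameters,
# and the number of intention classes are assumptions for illustration.
import torch
import torch.nn as nn

class CNNTransformerClassifier(nn.Module):
    def __init__(self, in_channels=9, d_model=64, n_heads=4, n_layers=2, n_classes=5):
        super().__init__()
        # 1-D CNN front end extracts local motion features per time step.
        self.cnn = nn.Sequential(
            nn.Conv1d(in_channels, d_model, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.Conv1d(d_model, d_model, kernel_size=5, padding=2),
            nn.ReLU(),
        )
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads, dim_feedforward=128, batch_first=True
        )
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=n_layers)
        self.head = nn.Linear(d_model, n_classes)

    def forward(self, x):                    # x: (batch, time, channels)
        z = self.cnn(x.transpose(1, 2))      # -> (batch, d_model, time)
        z = self.encoder(z.transpose(1, 2))  # -> (batch, time, d_model)
        return self.head(z.mean(dim=1))      # pool over time, then classify

model = CNNTransformerClassifier()
logits = model(torch.randn(8, 120, 9))  # 8 sequences, 120 steps, 9 sensor channels
print(logits.shape)                     # torch.Size([8, 5])
```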

8 pages, 1425 KiB  
Proceeding Paper
Development of Educational System for Buddhism and Meditation Using Virtual Reality Technology
by Yuan-Chin Hsu, Ming-Feng Wang and Chen-Shih Lu
Eng. Proc. 2025, 92(1), 50; https://doi.org/10.3390/engproc2025092050 - 6 May 2025
Viewed by 588
Abstract
In Taiwan, professional training for Buddhist meditation demands significant time and space, and these limitations reduce the effectiveness of learners’ practice. Therefore, a metaverse-based educational system was developed in this study to enable students to engage deeply with Buddhism and meditation while also offering the opportunity to learn virtual reality (VR) technology. The developed system further guides them in building VR systems for other meditation practices. The virtual space for Buddhist studies provides practitioners with an immersive environment for meditation, and through in-depth learning of Buddhist culture, students can develop creative meditation training models under the guidance of experienced practitioners. This system enables (1) the creation of a virtual space for Buddhist studies and meditation, (2) the provision of educational and training models for Buddhist practice, and (3) the ability to meditate through VR at any time and place. The system demonstrates the applicability of VR environments to Buddhist culture, promoting cultural heritage through digital and metaverse technologies. Full article
(This article belongs to the Proceedings of 2024 IEEE 6th Eurasia Conference on IoT, Communication and Engineering)

23 pages, 2858 KiB  
Article
Towards an Explicit Understanding of Network Governance in Urban Forestry Management: A Case Study of Portland (OR), USA
by Quadri Olatunbosun Taiwo and Vivek Shandas
Sustainability 2025, 17(9), 4028; https://doi.org/10.3390/su17094028 - 29 Apr 2025
Viewed by 753
Abstract
As the climate continues to warm and municipal leaders look for cost-effective and timely approaches to urban sustainability, one increasingly sought-out approach is the use of tree canopy to cool neighborhoods. Despite widespread efforts to expand tree canopy in cities, an overwhelming body of evidence suggests that urban green space is declining across the U.S., yet little is known about the factors that propel these changes. Understanding the institutional and governance systems involved can help identify opportunities to slow these consistent declines. Using social network analysis (SNA) metrics, we examined stakeholder roles in power structures and decision-making processes within Portland, Oregon’s urban forest management. Our results reveal a highly decentralized urban forestry network (density = 0.0079), with weak cohesion (5.4%) among 162 stakeholders. Moving forward, while network governance may face obstacles from conflicting interests among community and interagency groups, transforming governance models at all levels will require developing periodic, collaborative urban forestry management plans to address nature-based planning challenges. These planning documents should strongly emphasize not only the prioritization of tree equity-related ordinances but also the optimization of eco-literacy and awareness through virtual reality (VR) technology. As a novel approach, immersive simulations demonstrate practical potential for showcasing urban forestry benefits in network governance outreach and consensus-building. Full article
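The network metrics reported in this abstract (graph density and cohesion across 162 stakeholders) can be reproduced with standard SNA tooling. The snippet below is a generic NetworkX illustration on a hypothetical stakeholder edge list; the cohesion measure shown is a simple reachability proxy, since the study's exact cohesion metric is not specified here.

```python
# Illustrative SNA metrics of the kind reported in the abstract: density of a
# stakeholder collaboration network and the share of connected node pairs as a
# simple cohesion proxy. The edge list and node names are hypothetical.
import itertools
import networkx as nx

G = nx.Graph()
G.add_edges_from([
    ("City Forestry Bureau", "Neighborhood Assoc. A"),
    ("City Forestry Bureau", "Nonprofit Tree Group"),
    ("Nonprofit Tree Group", "Neighborhood Assoc. B"),
])
G.add_nodes_from(["Utility Company", "Parks Department"])  # isolated stakeholders

density = nx.density(G)  # edges present / edges possible

# Cohesion proxy: fraction of node pairs connected by some path.
pairs = list(itertools.combinations(G.nodes, 2))
connected = sum(nx.has_path(G, u, v) for u, v in pairs)
cohesion = connected / len(pairs)

print(f"density  = {density:.4f}")
print(f"cohesion = {cohesion:.1%}")
```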

22 pages, 4213 KiB  
Article
User Experience of Virtual Human and Immersive Virtual Reality Role-Playing in Psychological Testing and Assessment: A Case Study of ‘EmpathyVR’
by Sunny Thapa Magar, Haejung Suk and Teemu H. Laine
Sensors 2025, 25(9), 2719; https://doi.org/10.3390/s25092719 - 25 Apr 2025
Viewed by 707
Abstract
Recent immersive virtual reality (IVR) technologies provide users with an enhanced sense of spatial and social presence by integrating various modern technologies into virtual spaces and virtual humans (VHs). Researchers and practitioners in psychology are attempting to understand the psychological processes underlying human behavior by allowing users to engage in realistic experiences within illusions enabled by IVR technologies. This study examined the user experience of role-playing with VHs in the context of IVR-based psychological testing and assessment (PTA) with a focus on EmpathyVR, an IVR-based empathy-type assessment tool developed in an interdisciplinary project. This study aimed to evaluate the advantages and disadvantages of integrating IVR-based role-playing with VHs into PTA by examining user immersion, embodiment, and satisfaction. A mixed-method approach was used to collect data from 99 Korean adolescents. While the participants reported high levels of immersion and satisfaction, the sense of embodiment varied across respondents, suggesting that users may have had disparate experiences in terms of their connection to the virtual body. This study highlights the potential of IVR-based role-playing with VHs to enhance PTA, particularly in empathy-related assessments, while underscoring areas for improvement in user adaptation and VH realism. The results suggest that IVR experiences based on role-playing with VHs may be feasible for PTA to advance user experience and engagement. Full article
(This article belongs to the Special Issue Virtual Reality and Sensing Techniques for Human)

6 pages, 299 KiB  
Proceeding Paper
Three-Dimensional Creation and Physical Movement in Art Therapy Using Virtual Reality Painting
by Chia-Chieh Lee and Min-Chai Hsieh
Eng. Proc. 2025, 89(1), 46; https://doi.org/10.3390/engproc2025089046 - 17 Apr 2025
Viewed by 324
Abstract
Virtual Reality (VR) painting, an emerging form of artistic expression under 5G technology, showcases a broader range of expressive styles and dynamic visual effects compared to traditional computer graphics. The creative process in VR painting enhances spatial depth, exhibiting different spatial abilities and necessitating more physical movements, including hand controllers and foot movements in the virtual space. Furthermore, VR painting in art therapy encourages users to engage in physical activities, contributing to better emotional expression. This study involved digital-native users in VR painting, using Meta Quest 2 to operate Open Brush for their creations. Through observational methods, we examined user operational behaviors and conducted semi-structured interviews post-experiment to explore their painting performance and usage behaviors in the virtual environment. The results of this study indicate that VR painting enhances the sense of space and dynamic expression in creative work and improves users’ emotional and physical engagement, providing new avenues for artistic expression. These findings contribute to improving the usability and application value of VR paintings. Full article

18 pages, 5020 KiB  
Article
Virtual Reality as a Tool for Enhancing Understanding of Tactical Urbanism
by Italo Seghetto, Ricardo Lopes and Fernando Lima
Architecture 2025, 5(2), 26; https://doi.org/10.3390/architecture5020026 - 15 Apr 2025
Viewed by 726
Abstract
Tactical urbanism (TU) and Virtual Reality (VR) both aim to reimagine physical spaces, with TU utilizing rapid, temporary, scalable, and cost-effective physical interventions to test and refine urban design, while VR offers immersive virtual environments for exploration and analysis. This article investigates the integration of VR with TU to address challenges in effectively communicating and evaluating temporary urban interventions. This study is grounded in a literature review on spatial perception, TU, and VR, followed by an empirical experiment involving Brazilian college students. Participants interacted with a parklet installation in both physical and virtual environments, with their spatial perception and emotional responses evaluated using the AR4CUP (Augmented Reality for Collaborative Urban Planning) protocol. The results demonstrated that VR positively impacts the perception, usability, and social dynamics of urban spaces. Participants emphasized the importance of social interaction and recreational activities, reinforcing VR’s potential to simulate and refine urban interventions. A crucial avenue for future research is identifying best practices for using VR as a platform for collaborative design and decision-making. This step could enhance VR’s effectiveness in creating public spaces that align with community needs, fostering participatory planning and promoting inclusive, functional, and enriching environments. Full article