Review

A Survey of the Real-Time Metaverse: Challenges and Opportunities †

by Mohsen Hatami 1, Qian Qu 1, Yu Chen 1,*, Hisham Kholidy 2, Erik Blasch 3 and Erika Ardiles-Cruz 3
1 Department of Electrical and Computer Engineering, Binghamton University, Binghamton, NY 13902, USA
2 Computer Science Department, Northern Illinois University, Dekalb, IL 60115, USA
3 Air Force Research Laboratory, Rome, NY 13441, USA
* Author to whom correspondence should be addressed.
Approved for Public Release: Distribution Unlimited: Case Number AFRL-2024-5779.
Future Internet 2024, 16(10), 379; https://doi.org/10.3390/fi16100379
Submission received: 10 September 2024 / Revised: 12 October 2024 / Accepted: 16 October 2024 / Published: 18 October 2024

Abstract:
The metaverse concept has been evolving from static, pre-rendered virtual environments to a new frontier: the real-time metaverse. This survey paper explores the emerging field of real-time metaverse technologies, which enable the continuous integration of dynamic, real-world data into immersive virtual environments. We examine the key technologies driving this evolution, including advanced sensor systems (LiDAR, radar, cameras), artificial intelligence (AI) models for data interpretation, fast data fusion algorithms, and edge computing with 5G networks for low-latency data transmission. This paper reveals how these technologies are orchestrated to achieve near-instantaneous synchronization between physical and virtual worlds, a defining characteristic that distinguishes the real-time metaverse from its traditional counterparts. The survey provides comprehensive insight into the technical challenges and discusses solutions for realizing responsive, dynamic virtual environments. The potential applications and impact of real-time metaverse technologies across various fields are considered, including live entertainment, remote collaboration, dynamic simulations, and urban planning with digital twins. By synthesizing current research and identifying future directions, this survey provides a foundation for understanding and advancing the rapidly evolving landscape of real-time metaverse technologies, contributing to the growing body of knowledge on immersive digital experiences and setting the stage for further innovations in this transformative field.

1. Introduction

The metaverse is a complex, multi-component, hierarchical construct integrating various technologies and systems to create an immersive, three-dimensional (3D) interconnected virtual universe [1]. Figure 1 illustrates a seven-layer architecture of the metaverse. At its foundation, the Infrastructure Layer is the technical backbone that ensures the metaverse operates smoothly and efficiently [2]. It includes the physical hardware like servers and data centers and the cloud computing resources that provide the necessary computational power. A robust infrastructure is essential for scalability, enabling the metaverse to grow and accommodate increasing users and experiences.
The Interface layer determines how users interact with the metaverse [3], which includes the devices they use, such as VR headsets, AR glasses, and smartphones, as well as the software interfaces that make the metaverse accessible and user-friendly [1]. A well-designed interface is key to ensuring that the metaverse is easy to navigate and engaging, making it accessible to a wide range of users, regardless of their technical expertise.
The core of the metaverse is the Decentralization layer, which ensures that the metaverse operates without being controlled by a single entity. The decentralization layer is crucial for maintaining user autonomy, privacy, and security, often achieved through blockchain technology [4]. By distributing power and data across a network of users, decentralization enables true ownership of digital assets, allowing participants to engage in transactions and interactions with confidence that their data and property are secure [5].
Spatial computing plays an essential role in a metaverse, which merges the physical and digital worlds to create immersive experiences. The computing layer utilizes technologies like virtual reality (VR) [6], augmented reality (AR) [7], mixed reality (MR) [8], and haptic feedback systems [9], allowing users to interact with digital objects as if they were part of the physical world. Spatial computing makes the metaverse more tangible and real, enabling users to experience and manipulate 3D environments in ways that go beyond the limitations of traditional computing interfaces [10].
The Creator Economy layer is the engine that drives innovation and content within the metaverse [11]. This layer supports the tools and platforms that allow users to create, distribute, and monetize digital content and experiences [12]. The creator economy fosters a vibrant, self-sustaining ecosystem where creativity is rewarded by empowering individuals to produce and profit from their creations [13]. The creator’s content fuels the diversity of experiences available and encourages continual growth and expansion of the metaverse.
The Discovery layer helps users navigate the vast expanse of the metaverse [14]. It includes search engines, social networks, and recommendation systems that guide users toward content, experiences, and services that match their interests [15]. Effective discovery mechanisms are essential for helping users find and engage with what they are looking for, ensuring that the metaverse remains a dynamic and accessible space.
Finally, the Experiences layer is where the real value of the metaverse is realized [3]. It encompasses all the activities users can participate in, from socializing and gaming to education and commerce. The quality and variety of these experiences make the metaverse an engaging and appealing place for users to spend their time. The experience layer is constantly evolving, driven by the creativity of the community and the opportunities enabled by the other layers.
The metaverse is driven by several key technologies that collectively give it its immersive, interactive, and interconnected character [16], as conceptually shown in Figure 2. Blockchain technology underpins the decentralization and security aspects of the metaverse [17]. Blockchain ensures that digital assets, including virtual currencies, property, and identities, are securely managed and owned by users without relying on centralized authorities [18]. Blockchain technology provides transparency, immutability, and trust, enabling users to engage in secure transactions and interactions within the metaverse [19]. By leveraging blockchain, the metaverse can maintain a fair and open economy where users have full control over their digital assets and data.
Augmented reality (AR) and virtual reality (VR) are at the forefront of creating immersive experiences within the metaverse [20]. VR creates fully digital environments that users can explore and interact with using devices like headsets and gloves, effectively transporting them to another world. AR, on the other hand, overlays digital content in the real world, enhancing users’ perceptions and interactions with their surroundings [21]. Together, AR and VR provide the sensory and spatial components that make the metaverse feel tangible and engaging, allowing users to interact with digital spaces in ways that mimic real-world experiences. 5G is essential for enabling the metaverse due to its ultra-high-speed connectivity and low latency [22]. This technology allows for seamless, real-time interactions within virtual environments. It supports high-bandwidth applications like augmented reality (AR), virtual reality (VR), and 3D reconstructions by providing faster data transfer, improving the user experience even in highly populated virtual spaces [23]. It also enables advanced features like haptic feedback in virtual reality, enhancing immersion by allowing users to “feel” virtual objects or textures.
Artificial intelligence (AI) technologies drive the intelligence and responsiveness of the metaverse [24]. AI powers various aspects of the metaverse, including creating realistic virtual characters to interact with users and generating dynamic and adaptive environments [25]. AI also plays a crucial role in personalization, learning from users’ behaviors and preferences to tailor experiences that meet individual needs. By enabling complex decision-making and learning within the metaverse, AI ensures that the digital world is not only immersive but also intelligent and responsive. Brain–computer interfaces (BCIs) allow users to interact with virtual environments through brain signals, bypassing traditional input methods like keyboards or controllers [26]. This technology, while still in its early stages, has the potential to revolutionize the metaverse by enabling thought-based commands and avatar control. Early applications in gaming and productivity could lead to a more immersive and efficient experience. More people may use BCIs, which connect directly to the human neocortex and allow for advanced cognitive functions and interaction in virtual spaces [27].
Internet of Things (IoT) technology bridges the gap between the physical and digital worlds by connecting real-world edge devices to the metaverse [28]. IoT technology allows physical objects, from home appliances to vehicles, to communicate and interact with the digital environment. IoT integration enables real-time data exchange and interaction, making the metaverse a more seamless extension of the physical world [2,29]. For instance, IoT can allow users to control real-world devices within a virtual environment or bring physical data into the metaverse for analysis and visualization. Similarly, 3D reconstruction technology is vital in creating detailed and accurate digital representations of real-world environments [30]. This technology captures and digitizes physical spaces, objects, and people, bringing them into the metaverse with high fidelity. Three-dimensional (3D) reconstruction allows for creating virtual replicas of real-world locations, enabling users to explore and interact with these spaces as if they were physically present [31]. Three-dimensional (3D) capabilities are important for applications such as virtual tourism, real estate, and architecture within the metaverse.
All these technologies, working together, form the backbone of the metaverse, enabling users to create, explore, and interact in a fully realized digital universe [2].
While many survey papers cover the metaverse as a whole, this paper focuses specifically on the new frontier of the real-time metaverse paradigm. To the best of our knowledge, this survey is the first to comprehensively examine this emerging technological ecosystem. We present the challenges and opportunities involved in creating genuinely responsive virtual environments, from an analysis of state-of-the-art enabling technologies to their synergistic integration. The survey reconciles metaverse concepts with live systems and provides a starting point for researchers, developers, and industry stakeholders interested in exploring the future of immersive digital experiences. We hope it helps readers navigate this large and complex area, narrows the gap between vision and reality, and points the way toward dynamic, interactive virtual worlds that are seamlessly integrated with our physical world.
The rest of this paper is structured as follows. Section 2 presents a general overview of the real-time metaverse concept. Section 3 introduces critical enabling technologies. The state-of-the-art real-time metaverse is discussed in Section 4, followed by sections that address multiple key issues in the real-time metaverse. Section 5 outlines the technological infrastructure that supports real-time virtual experiences. Section 6 details the engines for immersive technologies. Section 7 tackles a critical but long-overlooked component of the metaverse ecosystem: standardization. The challenges and opportunities are illustrated in Section 8. Finally, Section 9 concludes this survey with future directions.

2. The Real-Time Metaverse

The real-time metaverse is a cutting-edge component of the ongoing digital revolution [32], serving as an advanced expansion of the broader metaverse concept. The real-time metaverse represents the next evolution in virtual worlds, transforming static digital environments into dynamic, interactive spaces continuously updated based on real-world data. The real-time metaverse goes beyond the traditional metaverse’s pre-built, static worlds [33]—where users can socialize, work, play, and explore—by incorporating actual changes in the real world as they happen. This shift unlocks a new realm of possibilities, blending physical and digital realities to create more immersive and interactive experiences.
The core of the real-time metaverse instantaneously syncs real-world activities and environments with their digital counterparts. This is achieved through advanced sensor technologies and other inputs that continuously capture and send data from the physical world. These sensors gather critical information about an environment’s geometry, movement, and visual details, feeding it into powerful data processing systems [34]. Data fusion techniques combine these diverse data streams into a single coherent model [35], which is then rendered in real-time within the virtual space. The result is a virtual environment that evolves in response to real-world changes, whether it is the movement of objects, changes in lighting, or adding new elements to a scene.
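As a rough illustration of how such fusion can work, the sketch below combines noisy one-dimensional position readings from several sensors using inverse-variance weighting, a common baseline behind more sophisticated fusion filters. The sensor names and noise figures are hypothetical, not drawn from any cited system.

```python
# Illustrative sketch: inverse-variance fusion of position estimates
# from heterogeneous sensors (e.g., LiDAR, radar, camera) into one
# coherent estimate. Sensor values and variances are hypothetical.

def fuse_estimates(readings):
    """Combine (value, variance) pairs; lower-variance sensors weigh more."""
    weights = [1.0 / var for _, var in readings]
    total = sum(weights)
    fused = sum(w * value for w, (value, _) in zip(weights, readings)) / total
    fused_var = 1.0 / total  # fused estimate is tighter than any single sensor
    return fused, fused_var

# Hypothetical 1-D position readings (meters) and noise variances.
sensors = [
    (10.2, 0.04),  # LiDAR: precise
    (10.8, 0.25),  # radar: noisier
    (9.9, 0.16),   # camera depth estimate
]

position, variance = fuse_estimates(sensors)
print(f"fused position: {position:.2f} m (variance {variance:.3f})")
```

Production pipelines would replace this with Kalman or particle filters operating on full 3D state and timestamps, but the principle of weighting each stream by its reliability carries over.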

2.1. Dynamic Environments and Real-Time Interaction

One of the key distinguishing features of the real-time metaverse is its dynamic nature. In a traditional metaverse, environments are typically static or periodically updated [21,36], meaning that users interact with a world that may not reflect ongoing changes in the physical world. This makes the experience somewhat disconnected from reality. On the other hand, the real-time metaverse is designed to be constantly evolving [37]. As sensors capture real-world changes—such as the construction of a new building, the movement of vehicles, or even weather conditions—those changes are reflected instantly in the virtual environment [38]. This dynamic interaction opens up a wide range of applications. For example, a virtual replica of a city can be continuously updated to reflect real-time traffic patterns [39,40], construction projects [41], or even local events. Users could navigate this city virtually and interact with it as they would in the physical world. The virtual ability to mirror real-world changes in real-time has far-reaching implications for industries such as urban planning [42], architecture [43], education [44], and entertainment [45].

2.2. Enhanced Immersion and Engagement

The real-time metaverse offers a more profound level of immersion than its traditional counterpart built from independent, static systems. The user experience becomes more engaging and lifelike with the seamless integration of real-world data. Imagine attending a virtual concert that mirrors a live performance in a physical venue, with sound, lighting, and audience reactions captured in real-time [46]. This blurring of boundaries between the real and the virtual world creates a sense of presence and immediacy that is difficult to achieve in static environments. Furthermore, the real-time metaverse enhances the interactivity of virtual experiences [47]. Users are no longer just passive participants in a pre-rendered environment but can actively interact with objects and spaces that reflect real-world conditions. For example, in a real-time virtual meeting space, participants could interact with digital versions of objects [48] that are being manipulated in the physical world. The increased level of interaction could revolutionize fields like remote collaboration, virtual workspaces, and education, where the ability to manipulate and experience real-time objects and data is crucial.

2.3. The Role of Advanced Technology

The realization of the real-time metaverse hinges on several key technological advancements. One of the most important is the ability to capture, transmit, and process data in real-time. This requires a high-speed, low-latency infrastructure, such as 5G networks and edge computing, which allows sensor data to be processed locally or close to the source, reducing transmission delays [49]. Additionally, powerful AI models are necessary to analyze and interpret the incoming data, ensuring it is accurately reflected in the virtual environment. AI also plays a crucial role in managing the complexity of real-time environments [50]. Machine learning algorithms can help predict how objects will move or change over time, making the real-time metaverse more fluid and reducing the impact of any delays in data transmission. Furthermore, AI-driven data fusion techniques allow the system to combine data from multiple sensors [51] into a unified 3D model.
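The latency-masking prediction mentioned above can be sketched, in its simplest form, as dead reckoning: between sensor updates the renderer extrapolates an object's pose from its last known position and velocity. All values below are illustrative assumptions.

```python
# Minimal dead-reckoning sketch: while fresh sensor updates are still
# in flight, the virtual scene extrapolates an object's position from
# its last known state, masking network delay. Values are illustrative.

def extrapolate(last_pos, velocity, elapsed_s):
    """Predict position after `elapsed_s` seconds without an update."""
    return tuple(p + v * elapsed_s for p, v in zip(last_pos, velocity))

last_update = (2.0, 0.5, 0.0)   # last synchronized position (m)
velocity = (1.2, 0.0, -0.3)     # estimated velocity (m/s)
network_delay = 0.05            # 50 ms since the last packet arrived

predicted = extrapolate(last_update, velocity, network_delay)
print(predicted)
```

Learned motion models can do better than constant velocity, especially for articulated objects like people, but the structural idea is the same: render the prediction now and correct it when real data arrives.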

2.4. Applications and Potential Impact

As digital transformation accelerates, the implications of the real-time metaverse extend far beyond mere entertainment, and its potential applications are vast and diverse. In terms of entertainment, live events such as sports games or concerts [52] could be experienced virtually in real-time, allowing users to engage with the event from anywhere in the world. In urban planning and architecture, creating a real-time virtual replica of a city can help planners and architects test designs and simulate the impact of changes [53] before they are made in the physical world. Education can also benefit from real-time virtual classrooms that mirror real-world laboratories or simulations [54], giving students hands-on experience in science, engineering, and medicine. The real-time metaverse also has the potential to transform social interactions [33]. Instead of static avatars meeting in a fixed environment, people could interact dynamically with each other in spaces that reflect their real-world surroundings, allowing for more meaningful and immersive connections. For example, a person in one part of the world could invite a friend to virtually experience their real-world environment in real-time, whether a stroll through a park or a live museum tour.
Figure 3 illustrates a hierarchical system for the real-time metaverse, consisting of three layers. At the physical layer, real-world actions and data are captured through sensors and devices in various domains, such as robotics, medical procedures, and fitness tracking. The data are transmitted via 5G networks to the technology layer, where cloud and high-performance computing (HPC) infrastructure processes and synchronizes it in real-time. The top metaverse layer represents digital twin environments, where the processed data are rendered into virtual replicas of physical actions, enabling immersive real-time interactions. This system ensures that real-world activities are mirrored seamlessly in the virtual world, supporting real-time metaverse experiences.

3. Key Enabling Technologies: An Overview

The metaverse represents a burgeoning frontier where the digital and physical worlds converge, offering immersive experiences through advancements in virtual reality (VR), augmented reality (AR), and other cutting-edge technologies. This section delves into the critical aspects of the metaverse, including blockchain integration, artificial intelligence (AI), edge computing, and the myriad challenges and implications of these innovations.
Blockchain technology is foundational for establishing a decentralized, secure, and interoperable metaverse [55]. Blockchain enhances metaverse functionalities such as data acquisition, storage, sharing, interoperability, and privacy preservation. Blockchain ensures the trustworthiness of transactions and interactions by providing a transparent and tamper-proof ledger, essential for managing digital assets and identities in a virtual world [1]. Blockchain can facilitate the creation of interoperable virtual worlds, allowing users to seamlessly transition and securely interact across different platforms, which is crucial for realizing a unified metaverse [56].
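The tamper-evidence property described above can be demonstrated with a toy hash-chained ledger: each block commits to its predecessor's hash, so modifying any historical entry breaks every subsequent link. This is a didactic sketch, not a production blockchain (it omits consensus, signatures, and networking).

```python
# Toy hash-chained ledger illustrating why blockchain records are
# tamper-evident. Didactic sketch only; payload strings are made up.
import hashlib

def block_hash(prev_hash, payload):
    return hashlib.sha256((prev_hash + payload).encode()).hexdigest()

def build_chain(entries):
    chain, prev = [], "0" * 64  # genesis link
    for payload in entries:
        h = block_hash(prev, payload)
        chain.append({"payload": payload, "prev": prev, "hash": h})
        prev = h
    return chain

def verify(chain):
    prev = "0" * 64
    for block in chain:
        if block["prev"] != prev or block_hash(prev, block["payload"]) != block["hash"]:
            return False
        prev = block["hash"]
    return True

ledger = build_chain(["alice->bob:5", "bob->carol:2"])
assert verify(ledger)
ledger[0]["payload"] = "alice->bob:500"  # tamper with history
assert not verify(ledger)                # every later link is now invalid
```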
In the metaverse, AI is pivotal in enhancing user experiences by enabling more natural and intuitive interactions [55]. AI applications such as natural language processing, computer vision, and neural interfaces are instrumental in creating responsive and adaptive virtual environments [57]. AI-driven avatars, capable of understanding and reacting to user inputs, significantly enhance the realism and immersion of the metaverse [58]. Edge computing complements AI by providing computational power and low-latency communication for real-time interactions in the metaverse [59].
The deployment of the metaverse faces several technological and infrastructural challenges [21]. The role of 5G/6G technology in overcoming these hurdles lies in its ability to provide ultra-low latency, high data rates, and enhanced reliability. The proposed layered architecture for integrating 6G with the metaverse addresses the need for scalable and efficient network infrastructure to handle the vast amounts of data generated by metaverse applications. Network scalability, data privacy, and security are identified as key challenges [32]. Advanced cryptographic algorithms and robust security protocols are required to protect user data and ensure secure transactions within the metaverse [60]. Developing interoperability standards is also critical to facilitate seamless interactions across different virtual environments [61].
Edge computing complements VR and AI by providing the computational power and low-latency communication necessary for real-time interactions in the metaverse; research also indicates that seamless real-time feedback is essential for user safety and effectiveness. Edge-enabled technologies play a crucial role in managing the real-time demands of VR and AR applications: by processing data closer to the user, edge computing reduces latency and improves the overall user experience [36]. In addition, integrating advanced queue management algorithms with edge computing can optimize network performance for data-intensive applications [62]. Such network integration is paramount for maintaining the high levels of responsiveness and interaction fidelity required by the metaverse, ensuring high-quality user experiences in applications that demand immediate feedback and minimal delays, such as health, education, and interactive entertainment.
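A back-of-the-envelope latency budget suggests why this placement matters: propagation and per-hop delays to a distant cloud can alone exceed a typical VR motion-to-photon target (roughly 20 ms), while a nearby edge node leaves headroom for processing. All distances, hop counts, and delays below are illustrative assumptions, not measurements.

```python
# Back-of-the-envelope latency budget: distant cloud vs. nearby edge
# node for a real-time rendering round trip. All numbers are
# illustrative assumptions.

def round_trip_ms(distance_km, per_hop_ms, hops, processing_ms):
    # ~5 us/km one-way propagation in fiber, doubled for the round trip.
    propagation = 2 * distance_km * 0.005
    return propagation + hops * per_hop_ms + processing_ms

cloud = round_trip_ms(distance_km=2000, per_hop_ms=0.5, hops=12, processing_ms=10)
edge = round_trip_ms(distance_km=20, per_hop_ms=0.5, hops=3, processing_ms=10)

print(f"cloud: {cloud:.1f} ms, edge: {edge:.1f} ms")
# A commonly cited VR motion-to-photon target is ~20 ms; under these
# assumptions only the edge path fits within it.
```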
VR and AR are core technologies driving the immersive experiences of the metaverse. Integrating haptic feedback in AR shows how tactile sensations can enhance user interactions in virtual environments [9,63]. Innovations such as the FingerTac haptic gloves provide real-time haptic feedback and improve the realism of AR applications [9]. The infrastructure supporting the metaverse must handle compute- and data-intensive tasks efficiently. Research has demonstrated that the application of VR combined with treadmill training can improve balance and mobility in individuals with traumatic brain injury (TBI) [64]. The integration of VR in rehabilitation highlights its potential in the metaverse, where real-time feedback and dynamic interaction are key. The study showed that participants engaged more with the VR-assisted training, reporting improved balance and mobility [65]. This aligns with the metaverse’s vision of creating adaptive and immersive spaces customized to individual user needs, whether for entertainment, social interaction, or healthcare.
The above research underscores VR’s transformative potential, especially in healthcare applications within the metaverse [66]. By offering real-time sensory feedback and stimulating neural pathways, as seen in neurorehabilitation, VR in the metaverse can provide personalized and therapeutic environments for users [67]. Therapeutic applications can have significant implications, particularly for industries such as healthcare, where virtual environments could be used for rehabilitation [68], therapy [69], and patient care [70]. Additionally, by enabling real-time interaction and multisensory engagement, VR enhances the immersive potential of the metaverse applications, making them more impactful and effective. The study on virtual taste and smell technologies further expands the sensory dimensions of VR, which is essential for a more immersive metaverse experience [71]. By integrating taste and smell, the metaverse can replicate real-world environments with a higher level of fidelity, crucial for applications in sectors such as education [72], gaming [72], and virtual tourism [73]. These technologies can enrich user experiences, offering deeper emotional and cognitive connections to the virtual world, which could be used in leisure and therapeutic contexts.
Developing multisensory technologies, such as taste and smell in VR, is essential for expanding the metaverse’s immersive capabilities [74,75]. Incorporating sensory feedback beyond visual and auditory stimuli could transform user experiences across various applications, from entertainment to education and healthcare [76]. By enhancing the multisensory engagement, the metaverse can simulate real-world interactions more effectively, providing users with a richer, more connected virtual experience. Moreover, research into VR’s potential in therapeutic settings showcases the broader applications of immersive virtual worlds. The metaverse could provide virtual spaces where patients receive specialized, personalized treatment plans, leveraging VR’s ability to simulate realistic environments and offer real-time sensory feedback [77].
Security and privacy are among the top concerns in the metaverse, where vast amounts of personal data and digital assets are exchanged [78]. The immersive and persistent nature of the metaverse magnifies these concerns, as users’ real-world identities, behaviors, and biometric data, such as eye movements or facial expressions, may be tracked and stored [21]. The importance of advanced cryptographic algorithms and robust security protocols to protect user data and ensure secure transactions is a fundamental requirement [79]. In addition, zero-knowledge proofs and homomorphic encryption are receiving increasing attention as ways to keep private user data safe while still allowing transactions to proceed without a trusted intermediary [80,81].
Some research argues that the decentralized nature of the metaverse offers a layer of security by eliminating the need for a central authority [17,82]. However, decentralization also presents challenges, particularly in maintaining data integrity and preventing unauthorized access. Without a centralized body to enforce security standards, the responsibility for safeguarding personal data is distributed across numerous stakeholders, opening the system to vulnerabilities in which malicious actors could gain control over decentralized networks. Non-fungible tokens (NFTs) are becoming increasingly popular and important to the metaverse economy; however, the risks of digital asset theft, counterfeit NFTs, and fraud call for strong identity verification systems and anti-fraud mechanisms.
The findings showed that the quality of user experience is another critical factor in the success of the metaverse [83]. The immersive nature of VR and AR technologies can significantly enhance user engagement, but they also come with challenges. Integrating haptic feedback in AR can improve the realism of virtual interactions [9]. However, delivering consistent and intuitive haptic feedback remains technically difficult due to network latency and resource allocation issues [84]. AI can enhance user interaction by making virtual environments more responsive and adaptive. AI-driven avatars, capable of understanding and reacting to user inputs, can create more engaging and personalized experiences. However, ensuring these AI systems operate seamlessly across different platforms and devices remains a significant challenge.
Some research has highlighted that scalability is another major challenge in the development of the metaverse [85]. Current social VR platforms struggle to support large numbers of concurrent users. The bandwidth requirements for transmitting high-resolution 3D content and real-time interactions can be immense, necessitating advanced networking techniques to ensure scalability. 5G and 6G technologies have the potential to address these scalability issues by providing ultra-low latency, high data rates, and enhanced reliability. The proposed layered architecture for integrating 6G with the metaverse could help manage the massive data traffic and support large-scale user interactions. However, deploying such infrastructure poses significant technical and economic challenges. Recently, researchers proposed the Microverse [16], a task-oriented, scaled-down metaverse instance, as a practical approach achievable with current technologies [86,87].
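The bandwidth pressure mentioned above can be made concrete with a simple per-user estimate: if each participant receives one compressed video-quality stream plus lightweight state updates from every other participant, downlink demand grows linearly with the audience. The rates used here are hypothetical assumptions for illustration only.

```python
# Rough per-user downlink estimate for a shared real-time scene.
# Stream and state-update rates are hypothetical assumptions.

def per_user_downlink_mbps(users, stream_mbps, state_kbps):
    """Downlink if a user receives one video-quality scene stream plus
    lightweight state updates from every other participant."""
    return stream_mbps + (users - 1) * state_kbps / 1000.0

# Assume a 25 Mbps compressed scene stream and 64 kbps of pose/state
# updates per remote participant.
for n in (10, 100, 1000):
    mbps = per_user_downlink_mbps(n, stream_mbps=25.0, state_kbps=64.0)
    print(f"{n} users -> {mbps:.3f} Mbps per user")
```

Even with aggressive interest management (only syncing nearby participants), this linear growth is why large shared events lean on server-side aggregation and tiered update rates.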
According to some studies, data integration is essential for creating seamless and coherent virtual experiences in the metaverse [21,88]. Edge computing manages the real-time demands of data integration by processing data closer to the user, which reduces latency and improves the quality of experience. However, ensuring consistent and accurate data synchronization across distributed edge nodes is complex. Integrating diverse data sources, including IoT devices, digital twins, and AI systems, into the metaverse enables the creation of rich and dynamic virtual environments, but requires robust frameworks to manage data interoperability and consistency.
In addition, data compatibility is another major challenge in the interoperability of different systems within the metaverse [88]. The lack of standardized data formats and protocols can hinder seamless interactions between virtual environments. Ensuring data compatibility involves developing common standards and protocols that can be adopted across various platforms and technologies. Cross-platform compatibility is essential for allowing users to access the metaverse from different devices and platforms [89]. The main challenge is achieving cross-platform compatibility, particularly in delivering consistent user experiences across devices with varying capabilities. Ensuring that applications and content are accessible and functional on different platforms requires significant effort in standardization and optimization.
Interoperability is another area of research in the metaverse, enabling seamless transitions and interactions across different virtual worlds [90]. Ensuring a cohesive experience of interoperability in the metaverse is crucial. AI and machine learning can facilitate interoperability by enabling systems to understand and adapt to different environments [91]. However, achieving true interoperability requires collaboration between developers, platform providers, and standardization bodies to establish common frameworks and protocols. To enable seamless interactions, it is required to integrate diverse technologies, such as blockchain, AI, and edge computing, and the need for standardized interfaces to enable seamless interactions [19]. Without interoperability, the metaverse risks becoming fragmented, with isolated virtual environments that cannot communicate with each other.
The metaverse holds significant potential for transforming education by creating immersive and interactive learning environments [92]. The metaverse can enable virtual classrooms where students and teachers interact in a shared virtual space, enhancing the learning experience through interactive simulations and collaborative projects. The flexibility and accessibility of the metaverse can democratize education, making high-quality learning resources available to a global audience. The concept of distance online learning in the metaverse can be expanded through evidence-based insights by leveraging immersive technologies like social virtual reality environments (SVREs) and AR. Research highlights that SVREs foster deep and meaningful learning (DML) by enabling collaborative, authentic interactions among learners [93]. These virtual environments allow students to engage in social and cognitive activities that parallel in-person experiences, thus addressing traditional distance learning challenges like isolation and disengagement [94]. AR and VR technologies further support DML by creating simulations that encourage active, reflective, and goal-oriented learning, which enhances knowledge retention and motivation. Moreover, studies indicate that integrating SVREs into distance learning can foster a strong sense of presence and co-presence, leading to richer educational experiences that replicate the dynamics of physical classrooms [93].
Beyond technological advancements, the metaverse presents several social and ethical considerations [95]. Some issues, such as user addiction, digital harassment, and equitable representation of avatars, are highlighted. The immersive nature of the metaverse can exacerbate these issues, necessitating the development of guidelines and regulations to protect users and promote a healthy virtual environment [96]. Additionally, integrating AI and blockchain raises data privacy and security concerns, which must be addressed to ensure user trust and safety in the metaverse.

4. State-of-the-Art Real-Time Metaverse

4.1. Integration of the Physical and Virtual Worlds

Figure 4 presents a detailed view of the architecture for integrating physical, virtual, and real-time metaverse layers within a digital ecosystem. The figure gives a conceptual overview of the physical–virtual world ecosystem, highlighting the synchronization of the physical and virtual environments. The Physical layer comprises four main components: users, IoT/sensors, virtual service providers, and physical service providers. Users engage with the virtual world using various devices, and IoT/sensors facilitate the synchronization between the physical and virtual worlds [15,97]. Virtual service providers offer digital goods and services. In contrast, physical service providers handle tangible goods and services transactions, highlighting the importance of interaction and synchronization between physical and virtual entities for a seamless user experience.
The Virtual layer includes avatars for virtual navigation, virtual environments for constructing the virtual world, and virtual goods/services such as virtual workspace and education [21]. The virtual layer bridges the gap between the physical and virtual worlds by providing immersive and interactive experiences. At its core, the real-time metaverse leverages data from IoT devices, sensors, and service providers to continuously update the virtual world in alignment with physical events [98]. The ability to “bring the physical world to the virtual” lies in deploying advanced sensors. These sensors capture critical information about real-world environments, such as the spatial layout, temperature, and motion of objects [99]. The data are then transmitted to cloud processing centers via high-speed communication technologies, enabling the real-time updating of virtual spaces [100]. Once the data are collected, they undergo data fusion—a process in which inputs from different sensors are combined to create a single, cohesive representation of the physical environment in the virtual world [101,102]. Fusing these diverse data types creates a rich, detailed, and accurate 3D model of the physical environment in the virtual world [103]. The data are processed in the cloud, allowing real-time synchronization between the two realms.
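The physical-to-virtual update loop described above can be sketched as follows; the dictionary-based scene, object identifiers, and sensor fields are illustrative assumptions for exposition, not an actual metaverse API.

```python
# Minimal sketch of physical-to-virtual synchronization, assuming a simple
# dictionary-based virtual scene; real systems would use streaming middleware.

import time

virtual_scene = {}  # virtual-world state keyed by object id

def sensor_reading(obj_id: str, position, temperature):
    """Simulate one sensor sample captured from the physical world."""
    return {"id": obj_id, "position": position,
            "temperature": temperature, "ts": time.time()}

def sync_to_virtual(reading: dict):
    """Mirror a physical reading into the virtual scene."""
    virtual_scene[reading["id"]] = {
        "position": reading["position"],
        "temperature": reading["temperature"],
        "last_update": reading["ts"],
    }

# One cycle of the loop: sense in the physical world, update the virtual one.
sync_to_virtual(sensor_reading("forklift-7", (12.4, 3.1, 0.0), 36.5))
assert virtual_scene["forklift-7"]["position"] == (12.4, 3.1, 0.0)
```

In a deployed system this loop would run continuously over a low-latency transport, so the virtual scene tracks physical events with minimal lag.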

4.2. The Role of the Metaverse Engine

The Metaverse engine is crucial for maintaining the intelligence and real-time nature of the virtual environment, as shown in Figure 4. This engine incorporates a variety of advanced technologies [104], including VR/AR, haptic feedback, digital twin (DT), AI, blockchain, mixed reality [8], and advanced graphical rendering [105]. These features allow users to interact with the digital world in a way that feels natural and immersive. The metaverse engine also ensures that the virtual environment responds dynamically to user inputs and real-world changes. For example, in an industrial setting, IoT sensors embedded in machinery could transmit live data about the status and operation of physical assets to the metaverse [106]. The engine would interpret this information and update the virtual workspace to reflect real-time changes in equipment status, such as a machine overheating or requiring maintenance. AI-driven systems further enhance the realism of the environment by predicting how objects might behave in the future, ensuring the virtual world is not only reactive but also proactive [107].
AI models play a critical role in maintaining the intelligence of the real-time metaverse [108]. These models are responsible for analyzing sensor data, detecting patterns, and automating processes within the virtual environment [24]. For example, natural language processing (NLP) enables seamless human–human communication in virtual spaces, while image generation models can create realistic textures and visuals from raw data. The Metaverse engine further personalizes the user experience by suggesting virtual environments, services, or goods based on real-time preferences and interactions.

4.3. Communication and Computational Infrastructure

For the real-time metaverse to function efficiently, robust infrastructure is necessary. Figure 4 highlights key infrastructure components—communication, computation, and storage—that support the real-time data flow between the physical and virtual worlds. Scalability is essential, as the system must handle ultra-dense connectivity with high data rate reliability and low-latency communication. Technologies like 5G and edge computing are pivotal in ensuring that real-time data can be collected and transmitted without delays [109]. These communication technologies facilitate real-time data transmission and control signals between sensors, robots, and user interfaces, ensuring low latency and high reliability [110]. Real-time connectivity is essential for maintaining the interactive nature of the real-time metaverse, allowing users to experience immediate responses to their actions.
In addition to scalability, the infrastructure must be ubiquitous [37] and shardless [111]. Trustworthiness [80] is another critical factor in real-time metaverse systems, as users must trust the data and services they interact with. Localized computing, including cloud-edge-assisted rendering, cloud-edge AI model training, and blockchain mining, is essential for delivering real-time experiences at scale, whether users are in urban areas with high connectivity or remote locations with limited infrastructure [90]. Technologies like blockchain provide decentralized storage, ensuring secure and verifiable transactions within the metaverse, from virtual goods to real-world e-commerce logistics.

4.4. Data Fusion and Real-Time Interaction

Figure 5 elaborates on the technical process of data fusion within the real-time metaverse. A variety of sensors, such as radar [112], LiDAR [113], RGB cameras [114], multispectral sensors [115], and hyperspectral sensors [25,116], collect data. With the high-speed connectivity of advanced communication systems, these data are transmitted to the cloud, where they are fed into the data fusion process [117,118]. For instance, LiDAR provides depth and spatial information, while RGB cameras contribute visual details like color and texture [119]. Thermal cameras capture temperature differences, and multispectral or hyperspectral sensors gather data across different wavelengths, offering unique insights into material properties [120]. This fusion of different sensor streams enables the creation of an integrated and high-fidelity virtual model that accurately reflects the physical world [121]. The fused data are then transmitted to robots and haptic control systems [122], enabling both physical and virtual objects to interact in real-time. For instance, an avatar in the metaverse could control a robot in the real world, with haptic feedback systems giving the user real-world sensations based on the robot’s actions. Haptic controls play a vital role in ensuring that users can feel and interact with virtual objects as if they were physical [76]. This real-time feedback system enables greater immersion and interaction, bridging the gap between the virtual and real worlds. In combination with AI, these systems allow for complex interactions between physical and virtual entities, whether it is for remote collaboration, education, or e-commerce applications.
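A minimal sketch of the sensor fusion step described above, under the simplifying assumption that the LiDAR points and RGB pixels are already spatially aligned (in practice, calibration and registration between sensors is itself a hard problem):

```python
# Illustrative sketch: fusing LiDAR depth with RGB color into a colored
# point cloud. Alignment between the two streams is assumed to be ideal.

def fuse(depth_points, rgb_pixels):
    """Pair each (x, y, z) LiDAR point with its aligned RGB pixel."""
    assert len(depth_points) == len(rgb_pixels), "streams must be aligned"
    return [{"xyz": p, "rgb": c} for p, c in zip(depth_points, rgb_pixels)]

cloud = fuse([(0.0, 0.0, 2.5), (0.1, 0.0, 2.6)],          # depth/spatial data
             [(200, 180, 160), (198, 182, 161)])          # color/texture data
assert cloud[0]["xyz"] == (0.0, 0.0, 2.5)
assert cloud[0]["rgb"] == (200, 180, 160)
```

Thermal, multispectral, and hyperspectral channels would be attached to each point in the same way, yielding the single cohesive representation that the virtual world renders.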
Cloud-edge computing, as depicted, ensures that data processing occurs as close as possible to the source, reducing the latency involved in transferring data to central servers [123]. Data and commands are processed in centralized servers or cloud-based processing centers. This setup supports complex computations and large-scale data storage, enabling the system to handle vast amounts of information and deliver real-time responses. The system benefits from scalable processing power and robust data management capabilities by leveraging cloud integration.

5. Technological Infrastructure

The technological infrastructure of the metaverse comprises a sophisticated and interconnected ecosystem of advanced technologies designed to support immersive, real-time virtual experiences [90]. The hardware includes high-speed and low-latency networks like 5G and fiber optics, powerful computing hardware such as graphics processing units (GPUs) and cloud computing resources, and robust data storage solutions [124]. Together, these components ensure the metaverse is a seamless, scalable, and interactive digital universe capable of supporting various applications from gaming and social interactions to professional and educational environments. Key enablers include high-performance computing, cloud and fog architectures, and edge devices as shown in Figure 6.

5.1. High-Performance Computing (HPC)

High-performance computing (HPC) leverages the combined power of supercomputers, cluster centers, and parallel processing techniques to tackle complex computational problems beyond standard desktop computers’ capabilities [125]. HPC systems are essential in various domains, such as scientific research, design engineering, and data analysis, enabling large-scale simulations, systems modeling, and processing of massive datasets. Supercomputers and HPC clusters of interconnected nodes perform calculations at incredible speeds by dividing tasks into smaller sub-problems solved simultaneously. Advanced data centers equipped with high-performance servers and GPUs are essential for processing the vast amounts of data required for real-time metaverse interactions [105]. Specialized software and tools, like MPI (message passing interface), OpenMP (open multi-processing), and CUDA (compute unified device architecture), support the development and execution of HPC applications, ensuring efficient use of the vast computing resources available [126]. Applications range from climate modeling and molecular dynamics in scientific research to computational fluid dynamics and structural analysis in engineering, as well as big data analytics and machine learning in data-intensive fields.
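The divide-and-conquer pattern described above, where a large task is split into sub-problems solved simultaneously, can be illustrated in miniature with Python's standard multiprocessing module; real HPC codes would use MPI, OpenMP, or CUDA across many nodes instead.

```python
# Toy illustration of the HPC decomposition pattern: one large computation
# split into sub-problems that worker processes solve in parallel.

from multiprocessing import Pool

def partial_sum(chunk):
    """Solve one sub-problem: sum of squares over a slice of the data."""
    return sum(x * x for x in chunk)

if __name__ == "__main__":
    data = list(range(1_000_000))
    chunks = [data[i::4] for i in range(4)]         # split into 4 sub-problems
    with Pool(processes=4) as pool:
        total = sum(pool.map(partial_sum, chunks))  # solve simultaneously
    assert total == sum(x * x for x in data)        # matches the serial result
```

The communication cost of distributing chunks and gathering results is exactly the overhead that, at cluster scale, must be minimized for efficient scaling.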
Despite the immense capabilities of HPC for a real-time metaverse, it faces several challenges, including scalability, energy consumption, cost, and complexity [127]. Scaling applications across thousands of processors efficiently requires optimizing code to minimize communication overhead. The significant power consumption of supercomputers and large clusters necessitates the development of energy-efficient designs. The high costs of HPC infrastructure and the specialized knowledge required for developing and maintaining applications further complicate its adoption. However, the future of HPC is promising, with advancements in exascale computing, quantum computing, and AI integration. Exascale systems will enable even more complex simulations and data analyses, while quantum computing could revolutionize fields like cryptography and material science [128]. Combining HPC with AI and machine learning will drive innovations across various domains, and research into energy-efficient technologies aims to reduce the environmental impact of HPC.

5.2. Cloud Computing

Cloud computing is a transformative technology that allows individuals and organizations to access and store data, applications, and computing power over the internet rather than relying on local servers or personal devices [129]. Cloud technology offers several key advantages, including scalability, flexibility, cost-efficiency, and accessibility [130]. Cloud services are typically categorized into three main types: infrastructure as a service (IaaS) [131], which provides virtualized computing resources over the internet; platform as a service (PaaS) [132], which offers hardware and software tools for application development; and software as a service (SaaS) [133], which delivers software applications over the internet on a subscription basis [134]. Major cloud service providers like Amazon Web Services (AWS), Microsoft Azure, and Google Cloud Platform (GCP) offer robust solutions that cater to a wide range of needs, from startups requiring minimal resources to enterprises needing extensive infrastructure and advanced services.
Cloud computing has revolutionized various industries by enabling more efficient and innovative business models. For instance, in the healthcare sector, cloud computing facilitates the secure storage and sharing of patient data, supports telehealth services, and enhances collaborative research through data analytics [135]. In finance, it allows for real-time transaction processing and advanced fraud detection. Additionally, cloud computing supports the growing field of remote work by providing seamless access to applications and data from any location, fostering collaboration and productivity [136]. Despite its many benefits, cloud computing also poses challenges such as data security, privacy concerns, and dependency on internet connectivity. However, ongoing advancements in cloud security protocols and hybrid cloud solutions, which combine private and public cloud resources, are addressing these issues and enhancing the reliability and security of cloud services.

5.3. Edge and Fog Computing

Edge and fog computing are paradigms that enhance data processing and analysis capabilities closer to the source of data generation, reducing latency and improving efficiency [137]. Edge computing involves processing data directly on devices or near the data source, such as sensors, IoT devices, or local servers [138]. The edge approach minimizes the need to send data to centralized cloud servers, reducing latency and bandwidth usage. Edge computing applications include real-time analytics, autonomous vehicles, and smart cities, where immediate data processing is crucial. For instance, in autonomous cars, edge computing allows for rapid decision-making based on real-time data from sensors, enhancing safety and performance.
Fog computing, on the other hand, extends the concept of edge computing by providing a distributed computing infrastructure that includes edge devices, local servers, and potentially the cloud [139]. The fog architecture acts as an intermediary layer that processes data before it reaches the cloud, providing additional storage and computational resources closer to the data source [140]. The fog–edge layered approach benefits applications requiring real-time processing and more substantial computational power or data aggregation. Fog computing is beneficial in scenarios like industrial IoT, where data from numerous devices needs to be aggregated and analyzed swiftly to optimize operations and maintenance. By distributing resources across multiple layers, fog computing improves overall system efficiency, scalability, and reliability.
Figure 6 illustrates a comprehensive architecture that integrates cloud computing, fog computing, and edge computing. At the center of the diagram is the cloud, symbolizing the centralized and extensive data processing capabilities of cloud computing. Cloud computing is depicted with multiple connections, including data centers and satellites, highlighting its broad reach and ability to handle significant computational tasks and data storage. Servers and network infrastructure placed closer to the data sources, such as IoT devices and sensors, represent fog computing. The fog layer aims to reduce latency by processing data near its origin before sending it to the cloud. Edge computing, depicted with smaller, decentralized servers and network nodes, is even closer to the end-users and devices. It emphasizes real-time data processing and immediate response actions, crucial for applications requiring low latency and high reliability. Integrating these computing paradigms ensures a balanced and efficient data handling from the core cloud to the network’s edge, optimizing performance and resource utilization.
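The fog layer's role of processing data near its origin before forwarding it to the cloud can be sketched as a simple local aggregation step; the threshold value and summary fields below are illustrative assumptions, not a standard protocol.

```python
# Sketch of a fog-layer aggregation step: raw sensor samples are reduced
# locally, and only the compact summary travels upstream to the cloud.

def edge_aggregate(samples, threshold=50.0):
    """Summarize raw samples at the edge; flag anomalies for immediate action."""
    mean = sum(samples) / len(samples)
    alert = max(samples) > threshold      # handled locally, low latency
    return {"mean": round(mean, 2), "count": len(samples), "alert": alert}

summary = edge_aggregate([21.0, 22.5, 20.8, 55.3])
assert summary["alert"] is True and summary["count"] == 4
```

Instead of shipping four raw readings to a central server, the fog node forwards one summary record, which is the bandwidth and latency saving the layered architecture is designed to deliver.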

5.4. 5G Communication Technology

Fifth-generation (5G) communication technology represents a significant leap in mobile communications, offering faster data speeds, lower latency, and greater connectivity compared to previous generations [141].
With theoretical speeds of up to 10 Gbps and latency as low as 1 millisecond, 5G enables a wide range of applications that require real-time data transmission and high bandwidth. These include enhanced mobile broadband, ultra-reliable low-latency communications (URLLC), and massive machine-type communications (mMTC) [142]. Industries such as healthcare, automotive, and entertainment are poised to benefit immensely from 5G. For example, 5G supports telemedicine and remote surgeries in healthcare by providing reliable, high-speed connections necessary for transmitting high-definition video and large medical data files in real-time.
Beyond 5G, the focus is on developing technologies like 6G, which aims to provide even higher speeds, lower latency, and more extensive connectivity [59]. 6G is expected to integrate advanced technologies such as artificial intelligence, machine learning, and blockchain to enhance network management, security, and efficiency. Potential applications of 6G include holographic communications, advanced AR and VR, and ubiquitous IoT connectivity, enabling smart environments and autonomous systems to operate seamlessly [143]. Research and development in 6G technology are exploring new spectrum bands, such as terahertz frequencies, to achieve these ambitious goals. As society moves toward 6G, the focus will also be on sustainable and energy-efficient solutions to support the ever-growing demand for data and connectivity.
Figure 7 shows a 5G network architecture, integrating various components and technologies to create a robust communication system. The core is a massive multiple-input multiple-output (MIMO) network that connects to resources and wireless and wired links [144]. The cloud provides extensive data processing and storage capabilities, connected to the core network and servers via wired links. The internet is crucial, linking the core network and servers to broader online resources. The mobile small cell network, including 5G-enabled devices, ensures coverage and connectivity for mobile users. The edge network includes sensors, IoT devices, and connected appliances, receiving and sending data through wireless links. A residential setup is also included, demonstrating the network’s ability to provide high-speed internet access to homes.

5.5. Storage Technology

Storage technology plays a critical role in the real-time metaverse by ensuring fast, secure, and efficient data handling for continuous immersive experiences [145]. As the metaverse involves interactions between physical and virtual environments, data from various sources such as sensors, edge devices, and AI systems must be cached and retrieved in real-time [146]. Technologies like macro/small base station caching [147] and device-to-device caching [148] help reduce latency by storing frequently accessed data closer to the user, ensuring faster retrieval times. This is essential in high-demand environments where real-time responsiveness directly impacts the user experience.
In addition to reducing latency, efficient storage mechanisms like edge AI model caching contribute to the real-time metaverse’s ability to manage large-scale, complex interactions without overwhelming the network [149]. Edge AI model caching allows real-time AI computations to occur closer to the user by caching AI models at edge nodes, which improves performance and reduces the load on central servers [150]. This approach enhances scalability and ensures that the system can support a high number of simultaneous interactions without performance degradation, a necessity in the immersive and constantly evolving world of the real-time metaverse.
Furthermore, optimal cache replacement is crucial in ensuring that the most relevant and frequently accessed data remains readily available, while outdated or less important data are efficiently replaced [151]. As the real-time metaverse generates and processes massive amounts of data, intelligent cache management prevents storage bottlenecks and ensures the availability of real-time data [152]. This ensures seamless transitions and interactions within the real-time metaverse, allowing users to experience smooth, uninterrupted virtual environments while maintaining the overall system’s trustworthiness and performance.
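One concrete cache replacement policy is least-recently-used (LRU) eviction, sketched below; production edge caches combine such policies with popularity prediction and data-freshness constraints, so this is a minimal illustration rather than a deployed design.

```python
# Minimal LRU cache sketch as one concrete replacement policy: the least
# recently accessed entry is evicted when capacity is exceeded.

from collections import OrderedDict

class LRUCache:
    def __init__(self, capacity: int):
        self.capacity = capacity
        self.store = OrderedDict()

    def get(self, key):
        if key not in self.store:
            return None
        self.store.move_to_end(key)          # mark as recently used
        return self.store[key]

    def put(self, key, value):
        if key in self.store:
            self.store.move_to_end(key)
        self.store[key] = value
        if len(self.store) > self.capacity:
            self.store.popitem(last=False)   # evict least-recently-used entry

cache = LRUCache(2)
cache.put("scene_a", "mesh_a")
cache.put("scene_b", "mesh_b")
cache.get("scene_a")                          # touch scene_a
cache.put("scene_c", "mesh_c")                # evicts scene_b, the LRU entry
assert cache.get("scene_b") is None
assert cache.get("scene_a") == "mesh_a"
```

Applied at a base station or edge node, such a policy keeps the most recently requested scene assets resident, so repeat requests are served without a round trip to the cloud.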

6. Metaverse Engine

The metaverse engine comprises seven key components: VR/AR, haptic feedback, mixed reality, and advanced graphical rendering (which together constitute the immersive technologies), along with the digital twin, AI, and blockchain. Each of these components will be detailed in the following sections.

6.1. Immersive Technologies

Immersive technologies for the real-time metaverse encompass a range of advanced tools and systems designed to create deeply engaging and interactive virtual experiences [105,153,154]. These technologies include VR, AR, mixed reality (MR) [155], haptic feedback [9], and advanced graphical rendering techniques [25], all of which work together to blur the lines between the physical and digital worlds. Figure 8 showcases the integration of VR/AR, haptic, and advanced graphical rendering technologies to enhance virtual and augmented reality experiences. VR/AR creates immersive environments using physics, animations, and auditory information. Haptic technology enhances human–machine communication and provides tactile feedback for real-life interactions. Advanced graphical rendering uses AI-driven techniques to generate realistic environments and enhance graphics, resulting in more immersive and interactive virtual experiences.

6.1.1. Virtual Reality (VR)

Virtual reality (VR) is a technology that immerses users in a computer-generated environment, providing a simulated experience that can be similar to or completely different from the real world [156]. VR typically involves headsets equipped with displays, sensors, and controllers that track the user’s movements and interactions within the virtual environment [157]. VR technology is widely used in gaming and entertainment, creating immersive experiences that engage users in new and exciting ways [158]. Beyond entertainment, VR has significant applications in education, where it enables interactive learning experiences, such as virtual field trips or simulations of complex scientific concepts, allowing students to explore and understand subjects more deeply.
In addition to its impact on gaming and education, VR is transforming industries like healthcare, real estate, and training [159]. In healthcare, VR is used for surgical simulations, allowing surgeons to practice procedures in a risk-free environment and for therapeutic purposes, such as exposure therapy for patients with anxiety disorders. In real estate, VR provides virtual tours of properties, giving potential buyers a realistic sense of space without needing to visit in person [160]. In professional training, VR offers a safe and controlled environment for employees to practice skills and scenarios, such as emergency response or complex machinery operation [161]. As VR technology advances, with improvements in display resolution, motion tracking, and user interfaces, its applications are expected to expand further, offering increasingly sophisticated and practical uses across various fields.

6.1.2. Augmented Reality (AR)

Augmented reality (AR) is a technology that overlays digital information and virtual objects onto the real world, enhancing the user’s perception and interaction with their environment [162]. Unlike virtual reality, which creates an entirely simulated experience, AR blends virtual elements with the physical world, often through smartphones, tablets, or AR glasses [163]. AR technology has gained widespread popularity in applications such as mobile gaming, with games like Pokémon GO allowing users to interact with virtual characters in real-world locations [164]. AR also enhances navigation and location-based services by providing real-time information and directions overlaid on the physical environment, improving user convenience and engagement [165].
Beyond entertainment and navigation, AR is making significant strides in many fields, such as retail, education [166] (distance online learning), and healthcare [167]. In retail, AR applications allow customers to visualize products in their own space before purchasing [168], such as seeing how furniture would look in their home or trying on virtual clothing [169]. AR not only enhances the shopping experience but also reduces return rates and increases customer satisfaction. In education, AR brings learning materials to life by enabling interactive and immersive experiences, such as 3D visualizations of historical events or scientific phenomena, which can deepen student understanding and engagement [54]. In healthcare, AR assists surgeons by providing real-time overlays of critical information during procedures, improving precision and outcomes [170]. As AR technology continues to evolve, its integration into everyday life and various professional fields is expected to grow, offering increasingly innovative and practical applications.

6.1.3. Mixed Reality (MR)

Mixed reality (MR) is an advanced technology that seamlessly blends the physical and digital worlds, creating environments where real and virtual elements coexist and interact in real-time [171]. Unlike VR, which immerses users in a completely virtual environment, or AR, which overlays digital information in the real world, MR allows for more complex interactions between physical and virtual objects. MR is typically achieved using advanced sensors, cameras, and displays, often incorporated into headsets like Microsoft’s HoloLens or Magic Leap. These devices track the user’s position and surroundings, enabling virtual objects to be anchored in the real world and interact with them naturally and intuitively [172].
The potential applications of mixed reality span numerous fields, enhancing productivity, creativity, and learning [8]. MR can be used in industry and manufacturing for remote collaboration, allowing engineers to visualize and manipulate 3D models of equipment or structures as if they were physically present [23]. This can improve design accuracy and speed up problem-solving processes. MR can provide immersive learning experiences in education, such as virtual laboratories where students can conduct experiments without the risk or expense associated with physical setups. MR can assist in medical training in the healthcare sector by simulating complex surgical procedures with real-time feedback and guidance. As MR technology continues to evolve [173], it promises to revolutionize how we interact with both the digital and physical worlds, providing a more integrated and interactive experience.

6.1.4. Haptic Feedback

Haptic feedback is a technology that simulates the sense of touch by applying forces, vibrations, or motions to the user [9,63]. Haptic sensory feedback enhances the immersive experience in virtual environments by allowing users to feel interactions with virtual objects as if they were real. In the context of the metaverse, haptic feedback is delivered through various devices such as gloves, vests, and other wearables, which can replicate sensations like texture, resistance, and impact. For instance, when a user picks up a virtual object or feels a virtual breeze, haptic devices provide physical sensations corresponding to those actions, making the virtual experience more realistic and engaging.
Integrating haptic feedback into the metaverse has significant implications for various applications, including gaming, training, and remote collaboration [174]. In gaming, haptic feedback enhances immersion by allowing players to physically feel in-game actions, such as holding equipment or surface texture. In training simulations, particularly in fields like medicine and engineering, haptic technology can provide realistic practice scenarios, helping users develop practical skills without real-world risks. For remote collaboration, haptic feedback can bridge the gap between digital and physical interactions, enabling more effective and intuitive communication. Overall, haptic feedback enriches the user experience in the metaverse by adding a critical layer of sensory interaction that deepens engagement and realism.

6.1.5. Advanced Graphical Rendering Technologies

Advanced graphical rendering (GR) technologies are crucial for creating highly realistic and visually stunning virtual environments in the metaverse [25]. One of the key advancements in GR is real-time ray tracing, a technique that simulates the way light interacts with objects in a scene to produce highly accurate reflections, refractions, and shadows [105,175]. This level of detail enhances the realism of virtual worlds, making them more immersive and visually engaging. Real-time ray tracing requires significant computational power, but recent advances in GPUs and optimization techniques have made it feasible for real-time applications, allowing users to experience lifelike visuals in interactive settings such as gaming, virtual tours, and social interactions within the metaverse.
Another significant development in advanced graphical rendering is using artificial intelligence (AI) to enhance graphics quality and performance [24,108]. AI-driven techniques, such as deep learning-based upscaling like NVIDIA’s deep learning super sampling (DLSS), can improve frame rates and image quality by predicting and generating high-resolution frames from lower-resolution inputs. This not only ensures smoother performance but also allows for more detailed and complex scenes without compromising on speed. AI is also used in procedural content generation, enabling the creation of vast and diverse virtual landscapes with minimal manual effort [176]. By leveraging these advanced rendering technologies, the metaverse can offer visually rich, dynamic, and interactive environments that push the boundaries of what is possible in digital experiences, making them more compelling and lifelike for users.

6.2. Digital Twin

The integration of digital twin (DT) technology into the metaverse is transformative, serving as a crucial bridge between the physical and virtual worlds [59]. By creating a real-time digital replica of physical entities, DTs enable an immersive and interactive metaverse experience. This constant synchronization ensures that changes in the physical world are instantly reflected in the virtual environment, allowing users to engage with highly accurate and dynamic digital representations of objects, environments, and even people [177]. In this way, DTs elevate the realism and functionality of the metaverse, making it a more practical and engaging platform for users.
One of the most significant challenges in this integration is achieving real-time synchronization between the physical and digital realms [75]. Digital twins must continually update as real-world changes occur, ensuring that the metaverse remains a faithful and current representation [36]. One promising approach is to distribute the metaverse into smaller, localized sub-metaverses that process data closer to their sources, reducing delays and enhancing the responsiveness of the digital twin environment [178]. This decentralized approach is particularly valuable in applications like autonomous vehicles [179], healthcare [180], and smart cities [181], where real-time accuracy is critical. In smart city ecosystems, the combination of DTs and the metaverse offers new dimensions for urban management and citizen interaction. Digital twins can mirror real-time city infrastructures, providing an immersive way for users to interact with urban spaces as though they are physically present. This not only improves user experience but also enhances urban planning, resource management, and disaster response through real-time simulations and predictive modeling [182]. As cities grow more complex, the ability to manage them in real time through a metaverse-driven interface becomes increasingly valuable, offering more efficient ways to allocate resources and address challenges.
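The synchronization requirement can be sketched as a small update loop: a twin ingests timestamped sensor readings, discards out-of-order data, and flags state that has fallen outside a freshness budget (all class and method names below are hypothetical, not drawn from any specific DT framework):

```python
class DigitalTwin:
    """Minimal digital-twin state mirror (illustrative sketch)."""

    def __init__(self, freshness_budget_s=0.1):
        self.state = {}            # latest mirrored value per physical property
        self.last_update = {}      # timestamp of that value
        self.freshness_budget_s = freshness_budget_s

    def ingest(self, prop, value, timestamp):
        """Apply a sensor update only if it is at least as new as what we hold."""
        if timestamp >= self.last_update.get(prop, float("-inf")):
            self.state[prop] = value
            self.last_update[prop] = timestamp

    def is_stale(self, prop, now):
        """True if the mirrored value has fallen behind the real world."""
        return now - self.last_update.get(prop, float("-inf")) > self.freshness_budget_s

twin = DigitalTwin(freshness_budget_s=0.1)
twin.ingest("temperature", 21.5, timestamp=0.00)
twin.ingest("temperature", 19.0, timestamp=-1.0)  # late, out-of-order reading: ignored
```

Distributing such twins across edge nodes shortens the sensor-to-ingest path, which is precisely what keeps `is_stale` from firing in latency-critical deployments.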
In sectors like healthcare and manufacturing, DTs integrated into the metaverse offer powerful tools for decision-making and efficiency [183]. For instance, in healthcare, digital twins can monitor patient health in real-time, allowing healthcare providers to predict potential issues and intervene before conditions worsen [180]. Similarly, in manufacturing, DTs simulate production processes, helping companies optimize performance, predict equipment failures, and reduce operational downtime [184]. By enabling these real-time simulations, the metaverse enhances decision-making capabilities and operational efficiency in critical industries. Security and trust are also essential aspects of the metaverse, where DTs play a pivotal role [185]. The decentralized nature of the metaverse, combined with blockchain technology, offers a secure and transparent framework for transactions and interactions. Blockchain ensures that all transactions are immutable, decentralized, and verifiable, making it ideal for safeguarding the integrity of digital twins and their data within the metaverse [186]. This decentralized trust mechanism is crucial as the metaverse becomes more integrated into everyday life, supporting interactions that are both secure and reliable.

6.3. Artificial Intelligence (AI) and Machine Learning (ML)

Artificial intelligence (AI) content generation plays a pivotal role in developing and enhancing the metaverse, a virtual universe where users can interact with each other and with digital environments in real time [24]. AI-based content creation in the metaverse draws on a range of technologies, including procedural generation, natural language processing (NLP) [187], and large language models (LLMs) [188]. These technologies collectively create more immersive, interactive, and personalized user experiences. Procedural generation is a method in which content is created algorithmically rather than manually, allowing for the creation of vast and diverse virtual worlds within the metaverse. The procedural technique can generate everything from complex landscapes to intricate architecture and even entire ecosystems. Procedural generation ensures that the metaverse remains dynamic and expansive, offering users new and unique experiences each time they log in. The ability to automatically generate content also significantly reduces the time and resources needed for manual creation, enabling developers to focus on other aspects of the metaverse.
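A minimal instance of the idea is one-dimensional midpoint-displacement terrain: because the output is fully determined by a seed, a landscape can be regenerated on demand rather than stored or hand-authored (an illustrative sketch; production systems use richer noise functions and grammar-based techniques):

```python
import random

def ridge_line(n_levels, roughness=0.5, seed=7):
    """1-D midpoint-displacement terrain profile.

    Start with two flat endpoints and repeatedly insert displaced midpoints,
    halving the displacement amplitude each level (controlled by
    `roughness`). Deterministic per seed."""
    rng = random.Random(seed)
    pts = [0.0, 0.0]
    amplitude = 1.0
    for _ in range(n_levels):
        nxt = []
        for a, b in zip(pts, pts[1:]):
            mid = (a + b) / 2 + rng.uniform(-amplitude, amplitude)
            nxt += [a, mid]
        nxt.append(pts[-1])
        pts = nxt
        amplitude *= roughness       # finer levels add finer detail
    return pts

profile = ridge_line(5)   # 2**5 + 1 = 33 height samples, identical per seed
```

The same seed-driven principle extends to 2D heightmaps, vegetation placement, and dungeon layouts: the world is defined by a generator plus a seed, not by stored geometry.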
NLP is another critical component in the AI content generation for the metaverse [187]. NLP enables more natural and intuitive interactions between users and virtual entities, including non-player characters (NPCs) and digital assistants [24]. By understanding and processing human language, NLP allows these virtual entities to respond appropriately to user inputs, facilitating more engaging and meaningful conversations. The NLP capability fosters a seamless and immersive user experience [189], as it bridges the gap between human communication and digital interaction. LLMs, such as GPT-4, further enhance the capabilities of NLP in the metaverse [188].
These models are trained on vast datasets and can generate human-like text based on contextual understanding. LLMs can create complex narratives, generate dialogue for NPCs, and even assist in real-time translation between users of different languages. The integration of LLMs into the metaverse ensures that the virtual world is rich in content and can adapt to the diverse linguistic needs of its global user base. The combination of procedural generation, NLP, and LLMs in the metaverse leads to a highly dynamic and personalized user experience. For instance, users can explore unique environments tailored to their preferences, engage in meaningful conversations with virtual entities, and enjoy evolving narratives based on their interactions [190]. The advanced level of personalization not only enhances user engagement but also fosters a sense of connection and immersion within the virtual world. The AI-driven content generation ensures that the metaverse remains a vibrant and evolving space, continuously offering new experiences. In virtual environments such as the metaverse, AI-driven avatars and non-player characters (NPCs) serve distinct roles, each with unique functionalities and purposes. Although both are powered by artificial intelligence, they differ fundamentally in how they interact with users, in their levels of autonomy, and in their overall goals within virtual spaces.

6.3.1. AI-Driven Avatars

AI-driven avatars represent a remarkable fusion of artificial intelligence and digital animation, poised to revolutionize how we interact in virtual environments [108]. These avatars are sophisticated digital representations of individuals designed to mimic human behavior, appearance, and communication with high realism and responsiveness. At the core of these avatars is advanced AI technology, enabling them to learn, adapt, and respond to users naturally and intuitively [191]. The AI-driven avatars are integral in enhancing the user experience within the metaverse, providing a seamless and immersive interaction that bridges the gap between humans and machines. One of the most significant aspects of AI-driven avatars is their ability to understand and process natural language [192]. Using natural language processing (NLP) algorithms, these avatars can interpret spoken or written language, allowing for fluid and dynamic conversations with users. Machine learning models further enhance this capability, enabling avatars to learn from interactions and improve their responses over time. As a result, users can engage in meaningful and personalized dialogues with their avatars, making virtual interactions more engaging and lifelike.
In addition to linguistic capabilities, AI-driven avatars are equipped with advanced facial recognition and emotion detection technologies [193]. These technologies allow avatars to read and respond to human emotions, adapting their behavior and expressions accordingly. For instance, an avatar can recognize when a user is happy, sad, or frustrated and tailor its responses to provide appropriate emotional support or feedback. This emotional intelligence adds another layer of depth to virtual interactions, making AI-driven avatars not just functional tools but empathetic companions in the digital world. The applications of AI-driven avatars extend far beyond entertainment and social interactions. These avatars can serve as personalized tutors in educational settings, offering tailored instruction and feedback to students [194]. They can provide efficient and empathetic support in customer service, handling inquiries, and resolving issues with a human touch [195]. In healthcare, AI-driven avatars can act as virtual companions for patients, providing comfort and monitoring their well-being [196]. By integrating AI-driven avatars into various sectors, the metaverse can unlock new possibilities for enhancing user experiences and improving the quality of services across multiple domains.
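The detect-then-adapt loop described above can be sketched with keyword rules standing in for the trained NLP and vision models a deployed avatar would use (all names below are hypothetical):

```python
# Keyword cues standing in for a learned emotion classifier.
EMOTION_KEYWORDS = {
    "frustrated": {"stuck", "annoying", "broken", "ugh"},
    "happy": {"great", "thanks", "awesome", "love"},
}

def detect_emotion(utterance):
    """Classify an utterance by simple cue matching; a real avatar would
    combine text, voice, and facial-expression models here."""
    words = set(utterance.lower().split())
    for emotion, cues in EMOTION_KEYWORDS.items():
        if words & cues:
            return emotion
    return "neutral"

def avatar_reply(utterance):
    """Adapt the avatar's tone to the detected emotion before answering."""
    emotion = detect_emotion(utterance)
    prefixes = {
        "frustrated": "I can see this is frustrating; let's slow down. ",
        "happy": "Glad to hear it! ",
        "neutral": "",
    }
    return prefixes[emotion] + "How can I help next?"

reply = avatar_reply("ugh this headset is broken")
```

The structure, classify the user's state and condition the response on it, is the same whether the classifier is a keyword table or a multimodal neural model.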

6.3.2. Non-Player Characters (NPCs)

NPCs are fully autonomous characters embedded in the virtual environment, designed to perform roles independent of any human control. NPCs are generally pre-programmed with specific behaviors or controlled by the system’s AI to serve particular functions [197]. For example, in a gaming context, NPCs often populate the environment as background characters or antagonists. However, in educational and professional settings, NPCs are increasingly being used as interactive tools to enhance user experiences. NPCs in the educational metaverse can take on roles such as virtual tutors, peers, or advisors [198]. These NPCs interact with users to foster engagement, provide feedback, or guide them through learning tasks.
NPCs are particularly useful in education because they can simulate real-world scenarios, offering dynamic, role-based interactions [199]. For instance, an NPC designed as a tutor can offer students advice on solving problems or guide them through complex design thinking processes. NPCs can also be peers or students, enabling users to practice teaching, mentoring, or collaboration skills in a controlled, simulated environment. This creates a scalable and adaptable learning space, where NPCs act as constant participants, available for interaction at any time, regardless of the availability of human peers or instructors [200]. Furthermore, NPCs often leverage advanced natural language processing (NLP) capabilities, enabling them to carry out conversations, understand user queries, and provide meaningful responses in a way that feels interactive and intuitive.

6.3.3. Differences in Purpose and Autonomy

The primary distinction between AI-driven avatars and NPCs lies in their relationship with users and their level of autonomy. AI-driven avatars are designed to represent and act on behalf of the user, making decisions based on user-defined parameters or learned behaviors [201]. They are personalized to the user’s preferences and can be seen as digital extensions of the user’s identity within the metaverse. These avatars help maintain a user’s presence, even when they are not actively engaged in the virtual world, by taking over routine tasks or social interactions. On the other hand, NPCs exist independently of the user [202]. They are part of the virtual environment itself, managed by the system’s AI to provide users with a more immersive and engaging experience. NPCs follow predefined rules or machine learning algorithms, interacting with users to fulfill specific roles—whether as educators, fellow learners, or virtual assistants. While they can simulate human-like behaviors, NPCs do not represent any specific user and are instead designed to enhance the user’s experience by populating the world with interactive, responsive characters. NPCs have also been used to facilitate design thinking by providing feedback, engaging in empathy-building exercises, and offering diverse perspectives that help students reframe problems and develop solutions [203].
Another key difference between AI-driven avatars and NPCs is the degree of personalization [201]. AI-driven avatars are highly personalized to their user. They adapt and evolve based on the user’s behavior, preferences, and interactions, becoming more reflective of the user over time. This level of personalization ensures that the avatar represents the user’s unique identity and can act in ways that align with the user’s goals and needs. For example, in a corporate setting, an AI avatar could autonomously schedule meetings, manage tasks, or even negotiate with other AI avatars based on the user’s work habits and objectives. NPCs, in contrast, are designed to serve broader, more generalized purposes within the virtual environment. They are not tied to any specific user and often follow universal rules or patterns set by the system. While NPCs can adapt their behavior based on interactions with multiple users, their core function is to enhance the immersive experience for all users rather than reflecting the behavior of any individual [204]. For example, an NPC in a classroom scenario might adapt to the learning pace of different students, offering tailored guidance and feedback, but its role remains fundamentally as a system-controlled character that enriches the educational experience for everyone, not just a single user.
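The contrast drawn above, a user-bound avatar that personalizes over time versus a system-owned NPC that plays the same role for every user, can be summarized in a small sketch (class names are illustrative, not from any metaverse platform):

```python
class Avatar:
    """Bound to one user; accumulates that user's preferences."""

    def __init__(self, owner):
        self.owner = owner
        self.preferences = {}          # evolves from this user's behavior

    def observe(self, key, value):
        self.preferences[key] = value  # personalization accrues over time

    def act(self):
        pace = self.preferences.get("pace", "normal")
        return f"{self.owner}'s avatar acting at {pace} pace"

class NPC:
    """Owned by the environment; serves any user in the same role."""

    def __init__(self, role):
        self.role = role               # system-defined, user-independent

    def interact(self, user):
        return f"{self.role} NPC guiding {user}"   # identical role for all users

alice = Avatar("alice")
alice.observe("pace", "fast")          # the avatar reflects its owner
tutor = NPC("tutor")                   # the NPC reflects its role
```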

6.4. Blockchain

Blockchain technology is emerging as a foundational component for the realization of a secure, decentralized, and efficient real-time metaverse [205]. In a digital environment where the virtual and physical worlds merge, blockchain offers several critical benefits that ensure the smooth functioning of this complex ecosystem. By enabling decentralized control, enhancing data security, and ensuring trust, blockchain significantly contributes to the metaverse’s infrastructure, making it not only more reliable but also scalable for widespread use [5]. One of the primary advantages of blockchain in the metaverse is its ability to provide a decentralized architecture [206]. Unlike traditional systems that rely on centralized servers, blockchain allows data and transactions to be distributed across a network of nodes, reducing the risk of single points of failure. In the metaverse, this decentralization ensures that no single entity has complete control over user data or interactions [207]. This is crucial for fostering a sense of trust and transparency in an environment where users engage in social, economic, and virtual activities that require accountability and fairness.
Security is another major factor that blockchain brings to the metaverse [208]. Given the vast amount of personal data, assets, and transactions occurring within this space, ensuring the integrity and security of these activities is essential. Blockchain’s immutable ledger ensures that once a transaction is recorded, it cannot be altered or tampered with, providing a high level of security [209]. This is particularly important for protecting digital assets like virtual property, NFTs (non-fungible tokens), and cryptocurrencies, which are becoming integral parts of the metaverse economy. Blockchain’s encryption and consensus mechanisms prevent fraud, unauthorized access, and cyberattacks, making the metaverse a safer environment for its users. In addition to decentralization and security, blockchain also plays a critical role in establishing trust and ownership within the metaverse. Smart contracts, which are self-executing contracts with the terms directly written into code, allow for transparent, automated transactions between users without the need for intermediaries [55]. This creates a system where users can buy, sell, and trade virtual goods or services with confidence, knowing that the transaction is secure and verifiable. Furthermore, blockchain’s ability to record and authenticate digital ownership ensures that users can prove ownership of their virtual assets, from in-game items to virtual real estate, further solidifying the importance of blockchain in creating a functioning digital economy within the metaverse.
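The tamper evidence that the immutable ledger provides comes from hash chaining: each block commits to the hash of its predecessor, so altering history invalidates every later link. A minimal sketch (illustrative only, omitting consensus, signatures, and networking):

```python
import hashlib
import json

def block_hash(block):
    """SHA-256 over the block's canonical JSON encoding."""
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def append_block(chain, transactions):
    """Link a new block to the hash of the previous one."""
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"prev_hash": prev, "transactions": transactions})

def verify(chain):
    """Any tampering with an earlier block breaks every later link."""
    return all(chain[i]["prev_hash"] == block_hash(chain[i - 1])
               for i in range(1, len(chain)))

chain = []
append_block(chain, [{"from": "alice", "to": "bob", "item": "virtual-parcel-7"}])
append_block(chain, [{"from": "bob", "to": "carol", "item": "virtual-parcel-7"}])
assert verify(chain)
chain[0]["transactions"][0]["to"] = "mallory"   # tamper with history...
assert not verify(chain)                        # ...and verification fails
```

In a deployed system, the same chaining is combined with distributed consensus and digital signatures, so that rewriting ownership records would require subverting the network as a whole, not just editing one copy.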

7. Interoperability Standards

As the real-time metaverse evolves, the need for seamless integration across various virtual worlds, platforms, and hardware has become increasingly apparent. The concept of interoperability—the ability of systems, platforms, and devices to work together harmoniously—lies at the heart of this vision. Interoperability standards ensure that different systems, devices, and applications can work together seamlessly, exchanging data and functionality without compatibility issues [55,88]. For users to experience a cohesive and immersive metaverse, regardless of the device or platform they are using, interoperability is key. It is particularly important in industries such as healthcare, finance, telecommunications, and information technology, where disparate systems need to interconnect [210,211]. By adhering to common protocols and formats, interoperability standards enable diverse systems to interoperate, reducing the need for costly custom integrations and minimizing the risk of data silos.
Figure 9 illustrates the concept of metaverse interoperability, depicting how users in the physical world can access a unified virtual world through various devices. The physical layer showcases the interoperability across these devices, allowing seamless transitions into the virtual world. Users can interact and navigate across various platforms and activities in the virtual layer once they are in the avatar-represented virtual world.

7.1. Standards Development Organizations

Standards development organizations (SDOs) are key players in the advancement of the metaverse [212], ensuring that different platforms, devices, and content within this evolving virtual ecosystem are interoperable, accessible, and scalable. These organizations develop open standards to promote uniformity and collaboration across industries [88], preventing fragmentation and proprietary limitations. The W3C (World Wide Web Consortium) has been instrumental in creating WebVR and WebXR [213], standards that enable immersive VR and AR experiences through web browsers. This makes content more accessible by eliminating the need for standalone applications. Similarly, the Web3D Consortium works on the X3D standard, which enables the sharing and use of 3D graphics across web platforms, crucial for delivering 3D content in the metaverse.
The ISO/IEC MPEG group has a longstanding role in developing standards for coding, compression, and transmission of audio and video content [214]. Their MPEG-V standard ensures smooth interoperability of immersive media, such as 3D and VR/AR content, across various devices. Meanwhile, the Khronos Group, through standards like OpenXR and glTF, is focused on providing open and royalty-free interoperability standards for XR devices and 3D content, helping developers create cross-platform applications [215]. The International Telecommunication Union Telecommunication Standardization Sector (ITU-T) Correspondence Group (CG)-metaverse is dedicated to the standardization of metaverse technologies, with a focus on enabling global interoperability through telecommunication and infrastructure standards [216]. The Open Metaverse Interoperability Group complements this effort by promoting open-source interoperability in the metaverse, advocating for user-friendly transitions between virtual worlds.
IEEE’s Metaverse Standards Committee, through its P2048 standards, is developing frameworks for VR/AR technologies on a global scale, ensuring technical coherence and best practices [212]. Similarly, the World Metaverse Council provides guidance on global policies and standards [217], focusing on ethical and regulatory frameworks for the metaverse. The Universal Scene Description (USD) standard improves 3D content interoperability for creation and collaboration in virtual environments [218]. Pixar, Adobe, and other companies lead the Alliance for OpenUSD [88]. Lastly, the Metaverse Standards Forum [96], which includes over 1,200 organizations, promotes open standards for the metaverse, driving industry-wide collaboration to create a more inclusive and interoperable virtual ecosystem. These SDOs collectively shape the future of the metaverse, ensuring it remains accessible, scalable, and technologically coherent across platforms, benefiting users and developers alike.

7.2. The Importance of Interoperability in the Real-Time Metaverse

In a real-time metaverse, users and assets must be able to move seamlessly across different virtual environments and platforms [219]. For instance, a user should be able to switch from a VR meeting application to an AR-enhanced virtual shopping experience, all while maintaining consistent access to their avatar, digital assets, and interactive capabilities [220]. The same applies to the integration of XR (extended reality) hardware, including virtual reality (VR) headsets, augmented reality (AR) glasses, and mixed reality (MR) systems. If these devices are built on incompatible standards, users are likely to experience significant barriers to interaction, including the need for different software or restrictions on which platforms they can access [161]. Two primary open standards that seek to solve these problems are OpenXR and WebXR [221]. These standards, developed and promoted by standards organizations, represent an important step toward achieving greater interoperability in the metaverse. However, the road to widespread adoption and true standardization is still long and fraught with challenges.
Open standards play a crucial role in achieving interoperability. These publicly accessible specifications are the result of consensus-driven processes that frequently involve numerous stakeholders from business, academia, and government [222]. Open standards are designed to be vendor-neutral, ensuring that any organization can implement them without restrictive licensing terms. This openness promotes competition and innovation, as developers can build interoperable products without being locked into proprietary technologies. Examples of open standards include HTTP and HTML for web technologies, TCP/IP for internet communications [223], and Health Level 7 (HL7) for healthcare data exchange [224].
The adoption of open standards offers several significant advantages. First, it enhances compatibility and integration, allowing different systems to communicate and share data more easily [225]. Such compatibility is particularly beneficial in complex environments such as smart cities, where various technologies must work together to deliver seamless services. Second, open standards support long-term sustainability and flexibility [226]. Since they are not tied to a single vendor, organizations can avoid vendor lock-in and more easily adapt to changing technological landscapes. Third, open standards foster a collaborative ecosystem where communities of developers and organizations can contribute to and benefit from shared advancements and improvements [227].

7.3. OpenXR

OpenXR is an open standard for VR and AR that aims to streamline the development of applications across different hardware platforms [228]. Khronos Group, a consortium of industry-leading companies, has developed OpenXR, which provides a unified framework for VR and AR runtime interfaces. This allows developers to write code that can run on various devices without tailoring their applications to each specific platform. OpenXR provides a set of standardized APIs (application programming interfaces) that sit between XR applications and the underlying hardware, effectively decoupling software from hardware [221]. By abstracting the hardware details, OpenXR significantly reduces the complexity and cost of developing cross-platform VR and AR experiences, fostering greater innovation and broader adoption in the immersive technology space.
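The decoupling pattern can be sketched generically: application code targets an abstract runtime interface, and each vendor supplies a conforming backend. Note that this shows only the architectural idea in Python form; the actual OpenXR API is a C API (with calls such as `xrCreateInstance`), and all names below are hypothetical:

```python
from abc import ABC, abstractmethod

class XRRuntime(ABC):
    """What the application codes against, regardless of vendor."""

    @abstractmethod
    def head_pose(self):
        """Return the tracked head position (x, y, z)."""

    @abstractmethod
    def submit_frame(self, image):
        """Hand a rendered frame to the vendor's compositor."""

class VendorARuntime(XRRuntime):
    def head_pose(self):
        return (0.0, 1.6, 0.0)          # vendor A's tracking stack
    def submit_frame(self, image):
        return f"A-compositor<{image}>"

class VendorBRuntime(XRRuntime):
    def head_pose(self):
        return (0.0, 1.7, 0.0)          # vendor B's tracking stack
    def submit_frame(self, image):
        return f"B-compositor<{image}>"

def render_once(runtime: XRRuntime):
    """Application code: written once, runs on any conforming runtime."""
    pose = runtime.head_pose()
    return runtime.submit_frame(f"frame@{pose}")

out_a = render_once(VendorARuntime())   # same app code on vendor A hardware...
out_b = render_once(VendorBRuntime())   # ...and on vendor B hardware
```

Because `render_once` never names a vendor, supporting a new headset means writing a new backend, not porting the application.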
The benefits of OpenXR extend beyond simplifying development. For end users, it ensures a more consistent and reliable experience across different VR and AR devices [229]. Since applications built with OpenXR can operate on multiple platforms, users are not limited to specific hardware when accessing their favorite VR and AR content. This interoperability also encourages competition among hardware manufacturers, as it levels the playing field and allows new entrants to support a rich content ecosystem without the barrier of proprietary software constraints. Ultimately, OpenXR plays a crucial role in the growth of the VR and AR industries by promoting a more open, inclusive, and efficient development environment.
This streamlined approach lets developers spend their time on content and user experience rather than on hardware compatibility and per-platform performance optimization [230]. For example, developers can ship the same application on a Meta Quest headset, which runs on a mobile chipset, and on a high-end PC VR system, with OpenXR handling the hardware-specific adaptations. These benefits matter all the more as the XR hardware landscape continues to diversify [231]. With the rise of new devices, from standalone headsets like the Meta Quest 3 to AR glasses from companies like Magic Leap and Apple [232], users have more options than ever before, and thanks to OpenXR they are no longer restricted to specific hardware ecosystems when accessing their favorite VR and AR content. For example, a user who purchases an app for their Meta Quest 3 can seamlessly transition to the same app on a Valve Index or a Windows Mixed Reality headset without losing functionality or performance, as long as both platforms support OpenXR.
Interoperability of this kind also reshapes competition among hardware manufacturers [233]: companies are incentivized to innovate in hardware design and performance rather than relying on walled gardens to retain users. For instance, smaller or newer VR/AR hardware companies that support OpenXR can leverage the existing library of OpenXR applications, thereby reducing the friction involved in building an ecosystem from scratch. Moreover, OpenXR promotes accessibility and inclusivity in the development and user communities [234]. For smaller development studios or individual creators, the cost of developing multiple versions of a VR/AR application for different platforms can be prohibitive. OpenXR mitigates this by offering a single codebase that can be deployed across devices, enabling these developers to reach more users without the financial burden of platform-specific development.

7.4. WebXR

WebXR is a robust application programming interface (API) that brings VR and AR experiences to the web, enabling developers to create immersive content accessible directly through web browsers [235]. Standardized by the World Wide Web Consortium (W3C), WebXR provides a framework for integrating VR and AR capabilities into web applications. This eliminates the need for users to download specialized software or applications, allowing them to experience immersive environments simply by navigating to a webpage. Because WebXR leverages the ubiquity of web browsers, it can deliver content to a large audience across a variety of devices, from desktops and laptops to smartphones and specialized VR headsets; this accessibility is essential for expanding the reach of VR and AR technologies.
The W3C developed WebXR to extend the capabilities of its predecessor, WebVR, which focused exclusively on virtual reality content [213]. WebXR, by contrast, is designed to support both VR and AR, reflecting the convergence of immersive technologies under the umbrella of XR (extended reality). By building on existing web standards, such as WebGL (for rendering 3D graphics) and WebRTC (for real-time communication), WebXR enables the seamless integration of immersive experiences into the web without requiring users to install additional plugins or extensions. This shift to a more plug-and-play model significantly reduces the friction for users, making VR and AR content as easily accessible as traditional multimedia like videos or interactive maps.
In addition to accessibility, WebXR offers significant development efficiencies. By adopting WebXR, developers can create cross-platform VR and AR content without needing to maintain separate codebases for different devices or platforms [236]. WebXR provides a unified API that works across various browsers and devices, making it easier to create device-agnostic immersive experiences. This allows developers to focus on the content and interaction design rather than worrying about compatibility issues across different hardware. Google Chrome, Firefox Reality, and Microsoft Edge fully support WebXR, enabling immersive experiences across a broad range of devices, from Oculus Quest to Microsoft HoloLens, as well as traditional desktop and mobile environments.
The increasing demand for web-based XR applications across various industries has also fueled the adoption of WebXR [237]. In e-commerce, for instance, WebXR enables AR product visualizations, allowing users to view 3D models of products in their real-world environment before making a purchase. WebXR facilitates the rapid development of such applications, as it allows businesses to embed AR experiences directly into their e-commerce sites without requiring users to install a separate app. Education and training are other sectors benefiting from WebXR [221]. With the rise of remote learning and virtual classrooms, WebXR enables delivering immersive educational content through standard browsers. Schools and universities can deploy virtual lab simulations, 3D models, and interactive lessons directly on their websites, accessible from any device. This is particularly useful in low-resource environments where schools may not have access to high-end VR hardware but can still provide immersive learning experiences using WebXR on desktop or mobile devices.
Healthcare is another industry that stands to benefit significantly from WebXR’s accessibility [238]. Medical professionals can use web-based VR simulations to practice procedures or conduct remote AR-assisted diagnostics. For example, WebXR-enabled telemedicine platforms could allow healthcare providers to remotely guide patients through diagnostic steps or deliver real-time AR overlays during consultations. By making such tools available via the web, WebXR expands access to critical healthcare services, particularly in remote or under-served regions where specialized hardware and software may not be readily available. In terms of future development, WebXR is expected to play a central role in the creation of the real-time metaverse. As the metaverse concept gains traction, the ability to access immersive virtual worlds via a web browser will be crucial for ensuring that the metaverse is open, accessible, and not restricted to proprietary platforms [239]. With WebXR, users could seamlessly enter and interact with the metaverse from any device, without needing specialized software or hardware. This aligns with the vision of a decentralized and open metaverse, where content and experiences are freely accessible and not locked behind walled gardens or expensive hardware ecosystems.

8. Challenges and Opportunities

8.1. Latency and Bandwidth

Latency and bandwidth are critical factors in the performance and user experience of the metaverse [240]. Latency refers to the delay between a user’s action and the system’s response, which is crucial for real-time interactions in virtual environments. High latency can result in lag, making movements and communications appear delayed or out of sync, thus disrupting the immersive experience [241]. Bandwidth, on the other hand, refers to the amount of data that can be transmitted over a network in a given period. Sufficient bandwidth is necessary to support the high data transfer rates required for high-resolution graphics, real-time interactions, and seamless streaming of virtual content [242]. Together, low latency and high bandwidth ensure smooth, responsive, and immersive experiences in the metaverse, allowing users to interact in real-time without interruptions or delays. Figure 10 illustrates the critical relationship between bandwidth and latency for various applications [243], focusing on a metaverse system.

8.1.1. Network Latency

Network latency refers to the delay between a user’s action and the response from the network in a digital environment [244]. In the context of the metaverse, which relies heavily on real-time interactions and seamless connectivity, latency can significantly impact user experience. High latency can cause noticeable delays in data transmission, leading to lagging or choppy movements, delayed responses from virtual entities, and disrupted interactions between users. These delays can break the immersion and fluidity of the metaverse, making it difficult for users to enjoy a smooth and interactive experience.
The effects of network latency on the metaverse are particularly pronounced in activities that require real-time synchronization [245], such as multiplayer gaming, live events, and social interactions. When latency is high, users may experience out-of-sync movements, communication breakdowns, and inconsistencies in the virtual environment. This lack of synchronization can lead to frustration and disengagement, undermining the metaverse’s overall effectiveness and appeal. Ensuring low latency is critical for maintaining the real-time responsiveness and immersive quality essential for a compelling metaverse experience. To address these challenges, advancements in network infrastructure, such as 5G technology and edge computing, are being leveraged to reduce latency and enhance the reliability of real-time interactions in the metaverse.

8.1.2. Bandwidth Limitations

A bandwidth limitation is a cap on the maximum rate at which data can be transmitted over a network. In the metaverse, bandwidth determines how much data can be exchanged between users and servers at any given time [246]. High-quality virtual experiences in the metaverse require transmitting large volumes of data, including high-resolution graphics, audio, and real-time interactive elements. When bandwidth is limited, these data transmissions can become bottlenecked, leading to slower loading times, reduced graphical fidelity, and overall degraded user experience.
The impact of bandwidth limitations on the metaverse is particularly significant during peak usage times or in areas with poor network infrastructure [247]. Users may experience lag, buffering, and disconnections, disrupting the immersive experience and hindering interactions within the virtual world [248]. Bandwidth constraints can also limit the scalability of the metaverse, as the infrastructure may struggle to support many concurrent users. To mitigate these issues, advancements in network technologies, such as fiber optics and 5G, are being deployed to increase bandwidth capacity [124]. Additionally, data compression and content delivery networks (CDNs) optimize data transmission and reduce the strain on bandwidth, ensuring a smoother and more reliable metaverse experience.
Table 1 highlights the progression of VR technologies by comparing bandwidth requirements, latency, and supported resolutions based on the latest published research. Wi-Fi 6E supports up to 8K resolution with bandwidth of up to 2.4 Gbps and approximately 20 ms latency, leveraging the 6 GHz band for enhanced performance in crowded environments [249]. Wi-Fi 7 improves on this, offering bandwidth of up to 46 Gbps and latency of around 10 ms, also supporting up to 8K resolution with multi-link operations and enhanced spectrum efficiency [250]. Wi-Fi 8 is expected to improve substantially on these capabilities by using millimeter-wave technology to reach data rates of up to 100 Gbps and latency levels below 1 ms, and it will support 8K and higher resolutions with features such as beamforming and multi-access-point (AP) coordination [251,252]. Moreover, 5G networks provide ultra-fast speeds of 1–10 Gbps and latency between 1 and 10 ms, capable of smoothly streaming up to 8K VR content, especially in mobile environments [253]. The combination of NGCodec with 5G optimizes bandwidth usage through advanced video compression, reducing latency to under 5 ms while supporting up to 8K resolution [254]. Finally, FPGA-based VR streaming uses re-configurable hardware for fast video encoding and decoding, with latency below 1 ms and support for up to 8K resolution, ensuring smooth, responsive interaction in real-time [255,256].
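To put these link capacities in perspective, the per-user bandwidth demand of a VR stream can be roughly estimated from resolution, frame rate, and codec compression ratio. The sketch below is illustrative only; the 24 bits per pixel and the ~100:1 compression ratio are assumptions chosen for the example, not figures taken from the cited studies.

```python
def required_bandwidth_gbps(width, height, bits_per_pixel, fps, compression_ratio):
    """Estimate the link bandwidth needed for one compressed video stream.

    compression_ratio is the raw-to-compressed size ratio (assumed ~100:1
    here, in the ballpark of modern codecs at typical VR quality settings).
    """
    raw_bps = width * height * bits_per_pixel * fps
    return raw_bps / compression_ratio / 1e9

# An 8K frame (7680x4320) at 90 fps, 24 bits per pixel, ~100:1 codec
demand = required_bandwidth_gbps(7680, 4320, 24, 90, 100)
print(f"~{demand:.2f} Gbps per user")  # about 0.72 Gbps

# Compare against the per-link capacities discussed above (Gbps)
links = {"Wi-Fi 6E": 2.4, "Wi-Fi 7": 46, "5G (upper bound)": 10}
for name, capacity in links.items():
    print(f"{name}: ~{int(capacity // demand)} such streams")
```

Under these assumptions, a single Wi-Fi 6E link supports only a handful of concurrent 8K streams, which illustrates why codec efficiency matters as much as raw link capacity.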

8.2. Blockchain in Real-Time Data Handling

Blockchain technology, while pivotal for decentralization and security in the metaverse, faces significant limitations when it comes to supporting real-time, high-frequency data exchanges [257]. Most blockchain systems, particularly those based on proof-of-work (PoW) consensus mechanisms, are designed to prioritize security and decentralization over speed. In platforms like Bitcoin and Ethereum, transaction processing times are measured in minutes, which creates unacceptable latency for real-time metaverse applications that require instant responses, such as virtual commerce, real-time collaboration, and live events. The number of transactions per second (TPS) that these networks can handle is limited—Bitcoin supports approximately 7 TPS, while Ethereum manages around 30 TPS [258]. This is inadequate when compared to traditional payment systems like Visa, which can handle over 24,000 TPS, illustrating blockchain’s current scalability issues. Even newer blockchain models, such as Ethereum’s transition to proof-of-stake (PoS), have improved efficiency but still face challenges in scaling to the demands of a real-time metaverse.
Additionally, the block size and transaction confirmation time exacerbate these scalability problems. Blockchains like Bitcoin limit the size of each block to 1 MB, and Ethereum’s average block size is similarly constrained, meaning only a small number of transactions can be processed in each block [259]. As the transaction volume increases—especially in a bustling metaverse with a large number of users interacting simultaneously—the network becomes congested, leading to increased transaction fees and even longer confirmation times. During periods of high demand, these delays and fees can render blockchain solutions impractical for applications where low latency and high throughput are critical, such as live virtual events or fast-moving virtual economies where digital assets must be exchanged instantaneously.
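The throughput ceiling implied by these block parameters follows directly from block size, average transaction size, and block interval. The sketch below assumes a ~250-byte average transaction, an illustrative figure rather than a measured one:

```python
def max_tps(block_size_bytes, avg_tx_size_bytes, block_interval_s):
    """Upper bound on transactions per second implied by block parameters."""
    txs_per_block = block_size_bytes // avg_tx_size_bytes
    return txs_per_block / block_interval_s

# Bitcoin-like parameters: 1 MB blocks roughly every 600 s,
# with an assumed ~250-byte average transaction
ceiling = max_tps(1_000_000, 250, 600)
print(f"implied ceiling: ~{ceiling:.1f} TPS")  # about 6.7 TPS

# A busy virtual economy with thousands of asset trades per second
# would exceed this ceiling by orders of magnitude, motivating
# off-chain (Layer 2) processing or larger/faster blocks.
```

The result is close to the ~7 TPS figure commonly cited for Bitcoin, showing how directly the block parameters bound on-chain throughput.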
Moreover, blockchain’s inherent structure—where every transaction must be verified and recorded across a distributed ledger—further exacerbates latency issues in the metaverse [260]. The decentralized nature of blockchain, while crucial for security and autonomy, slows down transaction processing times due to the computational overhead required for consensus mechanisms. The synchronization of multiple nodes in a decentralized network to achieve consensus can lead to significant delays, rendering blockchain impractical for high-speed applications like real-time gaming or virtual events, where immediate feedback is necessary. For example, each transaction in Ethereum requires computationally intensive operations, such as cryptographic validation and the execution of smart contracts, which further delays the transaction finalization process [261]. In a real-time metaverse environment, where user interactions must be instantaneous to maintain immersion, these delays significantly degrade the user experience.
The energy consumption of PoW-based blockchain networks also poses a significant challenge [262]. The computational power required to solve cryptographic puzzles and validate transactions in PoW networks like Bitcoin leads to substantial energy usage, which not only limits scalability but also raises concerns about the environmental sustainability of blockchain in a highly interactive metaverse environment. Although alternative consensus mechanisms like PoS reduce energy consumption, they still face trade-offs in decentralization, as staking-based models can lead to a concentration of power among users with significant holdings, undermining the distributed, trustless nature that blockchain aims to provide.
Another limitation is the challenge of interoperability between different blockchain networks within the metaverse [263]. With a variety of blockchains supporting different applications and digital assets, ensuring seamless communication and transaction finalization across multiple blockchains becomes a major technical hurdle. Cross-chain interoperability is still in its infancy, and current solutions like bridges are often slow, susceptible to vulnerabilities, and require additional layers of trust, which defeats the purpose of decentralization. In a real-time metaverse scenario, where assets and data need to flow freely across platforms, these delays in cross-chain transactions introduce friction and reduce the fluidity of the virtual economy.
These limitations suggest that blockchain, in its current form, is not equipped to handle the high-frequency, low-latency data exchanges essential for the real-time metaverse. For blockchain to become a viable infrastructure for the metaverse, it must evolve to address these performance bottlenecks [264]. Emerging solutions, such as Layer 2 protocols—which enable off-chain processing to reduce congestion on the main chain—and sharding, which divides the blockchain into smaller, parallel chains, offer potential avenues for improvement. However, these solutions are still in the development stage and have not undergone the kind of testing that a fully developed real-time metaverse would require. Until blockchain technology can meet the demands of speed, scalability, and efficiency required by these virtual environments, alternative technologies like centralized databases or hybrid blockchain systems may need to fill the gap in supporting real-time, interactive experiences.

8.3. Interoperability and Standards

Interoperability and standards pose significant challenges in the development and functionality of the metaverse [265]. Interoperability refers to the ability of metaverse systems, platforms, and applications to work seamlessly together. The diversity of virtual environments, avatars, digital assets, and interaction mechanisms created by various developers and organizations makes achieving interoperability complex. Without common standards across virtual worlds, users might face difficulties transferring digital identities, assets, and experiences. Such digital fragmentation can hinder the metaverse’s vision of a unified, cohesive digital universe, limiting user engagement and stifling innovation.

8.3.1. Universal Standards

Establishing universal standards for the metaverse is another formidable challenge. Standards are essential to ensure consistency, security, and compatibility across various platforms and technologies that constitute the metaverse [96]. However, reaching a consensus on these standards involves coordination among numerous stakeholders, including tech companies, developers, regulatory bodies, and users. Each group may have different priorities and interests, making it difficult to agree on a unified set of rules and protocols. Additionally, rapid technological advancements and the evolving nature of the metaverse further complicate standardization efforts. Without robust interoperability and standardized frameworks, the metaverse risks becoming a collection of isolated ecosystems rather than a seamless, interconnected digital reality. This major challenge is divided into two parts: fragmentation and data compatibility.

8.3.2. Fragmentation

Fragmentation refers to the coexistence of multiple, incompatible systems and protocols within the metaverse, leading to isolated digital environments that cannot seamlessly interact with each other [56]. Platform fragmentation arises when different developers and platforms adopt unique standards for creating and managing virtual worlds, avatars, digital assets, and user interactions. Without a unified approach, users face significant challenges when navigating between different parts of the metaverse, as their digital identities, assets, and experiences may not be transferable across these isolated ecosystems.
The fragmentation issue severely impacts the user experience and the overall potential of the metaverse [266]. For instance, a user might create an avatar and acquire virtual assets in one platform, only to find that these cannot be used or recognized in another platform. The lack of interoperability discourages users from investing time and resources into the metaverse, as they may be concerned about the portability and longevity of their digital assets. Moreover, developers are also hindered by fragmentation, as they must choose which standards to support or face the burden of trying to make their platforms compatible with multiple systems. The lack of universal standards stifles innovation and growth, preventing the metaverse from becoming the interconnected and expansive digital universe it aspires to be. Addressing fragmentation requires concerted efforts to establish and adopt common standards, ensuring that all parts of the metaverse can work together harmoniously.

8.3.3. Unified Standards

Unified standards are a set of agreed-upon protocols, formats, and guidelines that ensure compatibility and interoperability across different systems and platforms within the metaverse [96]. These standards aim to eliminate the barriers created by fragmentation, allowing for seamless integration and interaction between diverse virtual environments, digital assets, avatars, and user experiences [267]. By adopting unified standards, developers and platforms can ensure that users can move fluidly between different parts of the metaverse without losing functionality or having to recreate their digital presence.
Implementing unified standards is crucial for the metaverse to achieve its vision of a cohesive and interconnected digital universe [90,268]. These standards would cover various aspects such as data formats, communication protocols, security measures, and user interface guidelines. For example, a standardized avatar format would allow users to maintain the same digital identity across different virtual worlds. In contrast, standardized asset formats would ensure that digital goods purchased or created in one environment can be used in another. Unified standards thus foster a more inclusive and expansive metaverse, enhancing user experience, encouraging innovation, and promoting collaboration among developers and platforms.

8.3.4. Cross-Platform Compatibility

Cross-platform compatibility is the ability of different virtual environments, systems, and applications within the metaverse to work together seamlessly, regardless of the underlying platform or technology [269]. This means that users can access and interact with various parts of the metaverse using different devices (e.g., PCs, VR headsets, and mobile phones) and still have a consistent and unified experience. Cross-platform compatibility ensures that digital assets, avatars, user data, and interactions can be transferred and recognized across different systems without losing functionality or user experience.
Addressing cross-platform compatibility is essential for overcoming fragmentation in the metaverse. It allows users to navigate between different virtual worlds and applications without encountering barriers or needing to adapt to different interfaces or standards [219]. For example, a user should be able to purchase a digital item in one virtual world and use it in another, regardless of which platform each world is built on. Cross-platform compatibility involves adopting common standards and protocols that enable different systems to communicate and share data effectively. Cross-platform integration enhances the user experience by providing continuity and coherence across the metaverse, fostering greater engagement and participation from a broader audience.

8.3.5. Data Compatibility

Data compatibility ensures that different systems, platforms, and applications can effectively share, interpret, and use data across various environments [270]. Compatibility issues arise when there is no common format or protocol for data representation, storage, and exchange, leading to difficulties in integrating and synchronizing data across different virtual worlds and applications.
In the metaverse, data compatibility is essential for a seamless user experience [18]. Users create and interact with various digital assets, including avatars, virtual goods, and user-generated content. When these assets are created in one environment but are incompatible with others, users face significant barriers in transferring or utilizing their data across different platforms [10]. For instance, an avatar designed in one virtual world might not be recognized, or digital assets purchased in one marketplace might not be usable in another due to differences in data formats and structures.
The lack of data compatibility hampers the metaverse’s potential as a unified and interconnected digital universe [271]. To address these issues, there needs to be a concerted effort to develop and adopt common data standards and protocols. These standards would define how data should be formatted, stored, and exchanged, ensuring that it can be seamlessly interpreted and used across different systems. By achieving data compatibility, the metaverse can provide a more cohesive and integrated experience, allowing users to move fluidly between different environments and fully leverage their digital assets across the entire ecosystem. Common formats and data integration are the most important factors of data compatibility [267].
Common formats in data compatibility within the metaverse are essential for ensuring seamless interoperability across various platforms, applications, and virtual environments [146,223]. These standardized formats facilitate the consistent representation, storage, and exchange of data, allowing different systems to interpret and use the same data efficiently. Common formats are critical for avatars and digital identities, digital assets and virtual goods, user data and profiles, virtual currency and transactions, communication and interaction data, and environment and world data.
Common formats like FBX (Filmbox), OBJ (wavefront object), and glTF (GL transmission format) are widely used for 3D models and animations for avatars and digital identities. These formats ensure that avatars created in one virtual world can be rendered and animated correctly in another [272]. Metadata formats such as JSON (JavaScript object notation) and XML (extensible markup language) are essential for including information about avatars, such as appearance, accessories, and customizations, ensuring consistent interpretation across platforms. Standardizing these formats helps maintain the visual and functional integrity of avatars and digital identities as they move across different virtual environments.
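As a concrete illustration, avatar metadata of this kind can be serialized as JSON so that any platform can parse it back without loss. The record below is a minimal sketch; its field names and values are assumptions chosen for the example, not part of any published schema.

```python
import json

# Illustrative (non-standardized) avatar metadata record; every field name
# here is a hypothetical choice for the example, not a published standard.
avatar = {
    "id": "avatar-001",
    "model": {"format": "glTF", "uri": "avatar.glb"},
    "appearance": {"height_m": 1.75, "skin_tone": "#c68642"},
    "accessories": [{"slot": "head", "asset": "hat.glb"}],
}

serialized = json.dumps(avatar)    # portable, platform-neutral representation
restored = json.loads(serialized)  # any receiving platform can parse it back

assert restored == avatar  # round-trip preserves the full record
print(restored["model"]["format"])  # prints "glTF"
```

The value of agreeing on such a format is exactly this round-trip property: a record produced by one platform is interpreted identically by another.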
For digital assets and virtual goods, standardized formats for textures and materials—such as PNG (portable network graphics) and JPEG (joint photographic experts group) for textures, and PBR (physically based rendering) materials—ensure that the visual quality of digital assets is maintained across various platforms [273]. Asset bundles packaged in standard formats like Unity Asset Bundles or Unreal Engine’s asset format facilitate the transfer and usage of digital assets across different virtual worlds. Additionally, blockchain standards like ERC-20 or ERC-721 for tokens ensure that virtual currencies and digital collectibles can be managed and exchanged securely and interoperably [274].

8.3.6. Data Integration

Data integration is equally important for user data and profiles, virtual currency and transactions, and communication and interaction data [275]. Standard formats for user profiles, preferences, and settings (JSON or XML) enable consistent user experiences across different platforms. Adopting blockchain protocols like ERC-20 or ERC-721 for virtual currencies and transactions ensures secure and interoperable management of digital assets [274]. In terms of communication and interaction data, standardized protocols like XMPP (extensible messaging and presence protocol) or WebRTC (web real-time communication) ensure that text, voice, and video communication services are interoperable across different platforms [276]. Using standard formats for describing virtual environments, such as XML-based X3D or JSON-based Scene Graphs, also ensures that different platforms can render and interact with these environments consistently [277].
By adopting these common formats and data integration standards, the metaverse can achieve a higher level of interoperability, enabling a more cohesive and integrated user experience. This approach not only facilitates the seamless transfer of digital assets and data across different virtual environments but also enhances user engagement and satisfaction. As the metaverse continues to evolve, the importance of standardized data formats and integration protocols will only grow, ensuring that users enjoy a unified and immersive digital universe.

8.4. Scalability

The scalability of immersive technologies, such as VR and AR, is a major concern for the metaverse. Immersive platforms are highly resource-intensive, requiring vast amounts of computational power, bandwidth, and storage to render detailed, interactive 3D environments in real-time [278,279]. As the user base grows, the system’s ability to support an increasing number of concurrent users without compromising performance becomes a significant challenge. For instance, social VR platforms like VRChat or Facebook’s Horizon Worlds often experience server overloads, resulting in degraded user experience due to lag, buffering, or crashes when large numbers of users are interacting simultaneously. This problem is compounded by the need to transmit high-resolution 3D content, which demands significant bandwidth to maintain seamless user interaction. For example, high-resolution 3D models, complex textures, and lighting effects need to be transmitted and rendered in real-time, which places a heavy burden on both the network infrastructure and the user’s hardware.
The need for real-time synchronization of users’ actions and environmental changes within the metaverse further complicates the scalability issue [280]. In a fully immersive experience, every user’s movement, interaction, and even facial expressions must be reflected in real-time for all other participants. This requires not only high bandwidth but also extremely low latency to avoid noticeable lag, which can severely disrupt immersion. The concept of “presence”—a core aspect of VR that makes users feel as though they are truly inside a virtual environment—depends on these real-time interactions being smooth and instantaneous. Any delay in transmitting data between users or in rendering updates to the virtual world can break the illusion, reducing the effectiveness of the immersive experience.
While current network infrastructures have seen improvements with the advent of 5G, they still face limitations in supporting the data-heavy demands of immersive systems [85]. 5G promises ultra-low latency (around 1 millisecond) and higher data rates, but even this technology may struggle under the demands of a fully populated metaverse, where thousands or even millions of users interact simultaneously. For example, the bandwidth required for real-time rendering of high-definition 3D environments and live interactions can exceed what current 5G networks can handle at scale. In large virtual events, such as live concerts or digital sports games, the number of concurrent users may push network capacity beyond its limits, resulting in a laggy or interrupted experience for many participants.
In addition to network constraints, processing power remains a critical bottleneck [281]. Rendering real-time 3D environments, especially at the quality expected in high-end VR applications, requires significant computational resources. Current GPUs (graphics processing units) are already strained in handling the complex tasks required for real-time ray tracing, realistic physics simulations, and AI-driven interactions. In a fully immersive metaverse, these requirements scale exponentially as the number of users increases. Real-time data fusion, which integrates inputs from multiple sensors (e.g., motion tracking, facial recognition, and environmental mapping), requires extensive computational power. This is especially true for systems aiming to provide haptic feedback or real-world physics, where even minor delays can degrade the overall experience.
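The pressure on processing power is easy to quantify: at 90 fps, all rendering, physics, and sensor-fusion work for a frame must fit into a budget of roughly 11 ms. The per-task timings in the sketch below are illustrative assumptions, not measurements from any particular system.

```python
FRAME_RATE_HZ = 90
frame_budget_ms = 1000 / FRAME_RATE_HZ  # ~11.1 ms available per frame

# Illustrative per-frame costs in ms; real values vary widely by scene,
# hardware, and engine — these numbers are assumptions for the example.
tasks = {
    "rendering": 6.0,
    "physics": 2.0,
    "sensor fusion": 1.5,
    "network sync": 1.0,
}
total_ms = sum(tasks.values())

print(f"budget {frame_budget_ms:.1f} ms, used {total_ms:.1f} ms")
if total_ms > frame_budget_ms:
    print("frame overrun: expect dropped frames and a cybersickness risk")
else:
    print(f"headroom: {frame_budget_ms - total_ms:.1f} ms")
```

Even this toy budget leaves well under a millisecond of headroom per extra task, which is why per-user costs that grow with the number of participants quickly become untenable.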
Additionally, the diversity of devices and platforms presents another scalability challenge. Users will access the metaverse using a wide range of devices, from high-end VR headsets to mobile phones and desktop computers [267]. Each device has different processing capacities, display resolutions, and input methods, which makes it difficult to standardize the user experience. For example, a user on a low-end mobile device may experience lower-quality graphics and longer load times compared to a user on a high-end VR system, leading to disparities in the immersive experience. This inconsistency in performance can limit the scalability of the metaverse, as developers must balance the need for a high-quality experience with the technical limitations of diverse hardware. Furthermore, ensuring real-time synchronization across these varied devices without compromising the immersive experience for any user group requires sophisticated optimization techniques that are still in development.
Another significant factor is cloud infrastructure. While cloud computing can alleviate some of the processing burdens by offloading tasks to remote servers, the latency introduced by cloud-based rendering or computation can still be problematic for real-time applications [282]. Even with edge computing—where processing is done closer to the user—scalability issues remain. Edge servers must be distributed widely enough to handle the demand from local users, but deploying and maintaining this infrastructure at scale is costly and complex. Moreover, data synchronization between multiple edge servers and the central cloud can lead to consistency issues, where some users experience delays in receiving updates to the virtual world. This could be particularly problematic in environments that require precise timing, such as competitive gaming or virtual commerce platforms where milliseconds matter.
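A back-of-the-envelope comparison shows why edge placement matters: propagation delay alone grows with the distance to the server. The distances and the server processing time below are illustrative assumptions, not measurements.

```python
# Light travels at roughly 200,000 km/s in optical fiber (~2/3 of c)
FIBER_SPEED_KM_PER_S = 200_000

def round_trip_ms(distance_km, processing_ms):
    """Propagation delay there and back, plus server processing time."""
    propagation_ms = 2 * distance_km / FIBER_SPEED_KM_PER_S * 1000
    return propagation_ms + processing_ms

# Illustrative figures: a distant cloud region vs. a metro-area edge node,
# each assumed to spend 10 ms of processing per request
print(f"cloud (2000 km away): {round_trip_ms(2000, 10):.1f} ms")  # 30.0 ms
print(f"edge    (50 km away): {round_trip_ms(50, 10):.1f} ms")    # 10.5 ms
```

Under these assumptions, moving computation from a distant region to a metro edge removes almost 20 ms of round-trip delay, which is the difference between meeting and missing a typical real-time interaction budget.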

8.5. User Experience

User experience (UX) is a critical issue in the metaverse, as it directly impacts user engagement and satisfaction [283]. A seamless and intuitive UX is essential for making virtual environments enjoyable and accessible to a wide audience. However, creating an optimal UX in the metaverse is challenging due to the complexity of immersive technologies, the diversity of user needs, and the potential for adverse effects such as motion sickness [20]. Ensuring that users can navigate, interact, and communicate effectively within the metaverse requires careful design and continuous improvement based on user feedback.

8.5.1. Designing Intuitive Interfaces for Non-Expert Users

One of the greatest challenges in the real-time metaverse is ensuring that the environment is accessible to non-expert users [284]. Immersive technologies such as VR, AR, and MR (mixed reality) introduce a steep learning curve for many users who are not familiar with 3D spaces, controllers, or XR devices. For the mass adoption of the metaverse, intuitive and user-friendly interfaces are essential, allowing people to effortlessly interact with the virtual environment, regardless of their technical expertise. Simplifying the onboarding process is a critical first step. For example, using familiar UI paradigms such as gesture-based controls (e.g., swiping, pinching, or pointing) or natural language voice commands can help non-expert users engage with immersive environments more comfortably [285]. Furthermore, integrating in-world tutorials and AI-driven assistance (e.g., voice-guided navigation) can ensure that users are not overwhelmed by the complexity of virtual interactions.
Another important consideration is the use of contextual interfaces that adapt based on the user’s needs and experience level. For instance, advanced users may prefer more customizable options and complex interactions, while beginners might benefit from simplified menus and guidance systems [286]. Developers should also focus on minimizing the cognitive load on users by ensuring clear visual hierarchies, intuitive navigation paths, and consistent iconography that aligns with user expectations. As XR platforms continue to evolve, standardization of these interfaces across devices and applications will be key to reducing fragmentation and ensuring a cohesive user experience across the metaverse.

8.5.2. Addressing Physical and Mental Health Risks

The immersive technologies of the metaverse, specifically VR and AR, present significant physical and mental health challenges. Motion sickness, also known as cybersickness in virtual environments, is a significant UX issue in the metaverse [287]. It occurs when there is a disconnect between the visual input from the virtual environment and the user’s physical sensations, leading to symptoms such as nausea, dizziness, and disorientation [67]. Furthermore, users face risks of injury or accidents due to attentional distractions, especially with AR applications that blend digital content with real-world environments [288]. Psychological risks include potential aftereffects like intensified emotional responses or confusion about reality, particularly for vulnerable users like children or those with certain mental health conditions [289]. The phenomenon of “super-realism” further complicates matters, as experiences in highly realistic virtual environments may blur the lines between virtual and real experiences, potentially causing users to develop emotional or cognitive disturbances when transitioning back to the physical world [290].
To address cybersickness, developers can implement several strategies. Optimizing frame rates (ideally above 90 frames per second) is essential to ensure smooth rendering and reduce lag and visual inconsistencies, which are major contributors to motion sickness [291]. Likewise, minimizing latency in user inputs and head tracking (targeting less than 20 milliseconds) can help maintain a responsive experience, ensuring that virtual movements feel synchronized with physical actions. Designing predictable and smooth motion paths within virtual environments also helps reduce the likelihood of motion sickness. For example, rather than sudden shifts in perspective, developers can implement gradual transitions that simulate natural acceleration and deceleration, allowing users’ senses to adjust to the changes in motion. Providing users with customizable comfort settings—such as adjustable field of view (FOV), camera smoothing, and snap-turning (a technique that reduces perceived rotational motion by turning the view in discrete increments)—can help users tailor their experience to their comfort levels, significantly reducing the risk of cybersickness [292]. Furthermore, teleportation (i.e., jumping from one location to another without continuous movement) has emerged as a popular locomotion method in VR that mitigates motion sickness by eliminating the disconnect between visual and physical motion. Offering multiple locomotion modes—such as teleportation, smooth walking, or vehicle simulation—allows users to choose their preferred method of movement based on their susceptibility to cybersickness [293].
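These targets can be combined into a simple motion-to-photon budget check. The pipeline stage timings below are illustrative assumptions, not measured values from any headset.

```python
TARGET_MS = 20  # commonly cited motion-to-photon comfort target

# Illustrative stage latencies in ms (assumed values for the example)
stages = {
    "tracking sample": 2.0,
    "pose prediction": 1.0,
    "render (one 90 fps frame)": 11.1,
    "display scanout": 4.0,
}
motion_to_photon_ms = sum(stages.values())

verdict = "within" if motion_to_photon_ms <= TARGET_MS else "over"
print(f"motion-to-photon: {motion_to_photon_ms:.1f} ms "
      f"({verdict} the {TARGET_MS} ms target)")
```

With these assumed numbers the pipeline lands just under the 20 ms target, illustrating how little slack is left for network hops or extra processing stages.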

8.5.3. Accessibility and Inclusivity in the Metaverse

Accessibility in the metaverse is another crucial aspect of UX: ensuring that virtual environments are usable by everyone, including people with disabilities, must be a core element of design rather than an afterthought [294]. The immersive nature of the metaverse provides new opportunities for inclusivity, but it also introduces challenges in accommodating users with physical, sensory, and cognitive impairments. For users with mobility impairments, developers must provide multiple input methods, such as adaptive controllers, gesture-based inputs, and voice commands [295]. These options ensure that users who may not be able to use traditional controllers or keyboards can still navigate and interact with virtual environments.
Moreover, environments should be designed to support different movement speeds, seated play, and adjustable reach heights to accommodate users with limited physical mobility. For individuals with visual impairments, text-to-speech (TTS) features can enable the conversion of on-screen text into spoken words, while haptic feedback can provide tactile cues to simulate touch and interaction [296]. Audio descriptions of visual elements, such as describing the environment, actions, or characters, can further enhance the experience for users with limited sight. For users with hearing impairments, integrating speech-to-text (STT) features allows spoken dialogue to be transcribed in real time, ensuring that users can engage with conversations and activities in the metaverse [297]. Additionally, ensuring compatibility with hearing aids or cochlear implants is an important consideration for creating more inclusive soundscapes.

Cognitive accessibility is another critical area. For users with cognitive disabilities, such as dyslexia, autism spectrum disorders, or attention deficits, virtual environments must be designed to minimize confusion and overload [298]. Features such as clear navigation cues, adjustable text sizes, simple language options, and color contrast settings can make the metaverse easier to understand and navigate for these users. Additionally, customizable UI layouts that allow users to control the complexity and frequency of information displayed can make virtual environments more accommodating to a wider range of cognitive abilities.
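One common way to support the multiple input methods described above (adaptive controllers, gestures, voice) is to route every modality through a single action map, so that any bound trigger reaches the same behavior. The sketch below uses hypothetical trigger names; a production system would instead hook into the platform's accessibility and input APIs.

```python
from typing import Dict, Optional


class AccessibleInput:
    """Map several input modalities onto one abstract action so users
    with different abilities can trigger the same behavior."""

    def __init__(self) -> None:
        self._bindings: Dict[str, str] = {}

    def bind(self, action: str, *triggers: str) -> None:
        """Associate an action with any number of triggers
        (controller buttons, voice commands, gestures)."""
        for trigger in triggers:
            self._bindings[trigger.lower()] = action

    def resolve(self, trigger: str) -> Optional[str]:
        """Return the action bound to a trigger, or None if unbound."""
        return self._bindings.get(trigger.lower())


# Illustrative bindings: three very different modalities, one action.
inputs = AccessibleInput()
inputs.bind("open_menu", "button_b", "voice:menu", "gesture:palm_up")
```

The game logic only ever sees the abstract action (`open_menu` here), so adding a new modality never requires touching gameplay code, which is what makes retrofitting accessibility feasible.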

8.5.4. Social Interaction and Complexity in the Metaverse

Social interaction is one of the most promising but also challenging aspects of the metaverse [299]. It allows users to connect with others, form relationships, collaborate, and engage in shared experiences across vast virtual spaces. However, the complexity of social interactions in immersive environments brings its own set of UX challenges. In physical reality, social cues such as eye contact, body language, and tone of voice are vital for smooth communication, but replicating these nuances in a virtual environment remains difficult. Developers need to focus on creating natural communication systems that replicate these subtle social cues as much as possible [300]. For instance, spatial audio—where sound emanates from the direction of a virtual avatar—helps users better engage with conversations, creating a sense of presence and immersion. Facial tracking and emotion mapping (enabled by newer VR/AR headsets) can also enhance avatar realism, making social interactions more intuitive.
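Spatial audio of the kind described above is typically driven by a per-source gain and pan computed from the avatar's position relative to the listener. The following is a deliberately simplified 2-D sketch using linear distance roll-off and constant-power panning; a real engine would use HRTFs and full 3-D listener orientation.

```python
import math


def spatial_gain(listener, source, max_distance=20.0):
    """Compute (left, right) channel gains for a sound source.

    listener, source: (x, y) positions; the listener is assumed to
    face +y, so positive x offsets pan the sound to the right.
    """
    dx = source[0] - listener[0]
    dy = source[1] - listener[1]
    dist = math.hypot(dx, dy)
    # Linear distance roll-off, clamped to [0, 1].
    attenuation = max(0.0, 1.0 - dist / max_distance)
    # Pan in [-1, 1]: -1 fully left, +1 fully right.
    pan = 0.0 if dist == 0 else max(-1.0, min(1.0, dx / dist))
    # Constant-power panning keeps perceived loudness stable.
    angle = (pan + 1.0) * math.pi / 4.0   # maps [-1, 1] -> [0, pi/2]
    return attenuation * math.cos(angle), attenuation * math.sin(angle)
```

A source directly ahead yields equal left/right gains, while a source to the listener's right drives the right channel, which is exactly the cue that lets users locate a speaking avatar by ear.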
However, there are potential downsides to social interaction in the metaverse, such as the risk of harassment or exclusion [301]. Developers need to implement safety features such as personal space bubbles, mute/block functions, and reporting mechanisms to ensure that users feel safe while interacting with others in the virtual world. Moreover, AI-driven content moderation systems can help identify and mitigate toxic behaviors in real time, ensuring a positive social experience for all users. In addition, creating inclusive social spaces that accommodate users of diverse backgrounds, abilities, and cultures is key to the success of the metaverse. By offering customizable avatars that represent various ethnicities, body types, and disabilities, developers can promote diversity and ensure that all users feel represented and included in virtual environments.
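The personal-space-bubble and block-list mechanisms mentioned above are usually enforced client-side: a blocked avatar, or one that intrudes into the bubble, is simply not rendered (or is faded out). A minimal sketch, with an illustrative default radius:

```python
import math


class SafetyFilter:
    """Client-side safety sketch: blocked users and anyone inside the
    personal-space bubble are not rendered (radius is illustrative)."""

    def __init__(self, bubble_radius: float = 1.5) -> None:
        self.bubble_radius = bubble_radius
        self.blocked = set()

    def block(self, user_id: str) -> None:
        """Add a user to the block list (mute/block feature)."""
        self.blocked.add(user_id)

    def should_render(self, user_id: str, my_pos, their_pos) -> bool:
        """Decide whether another avatar should be drawn at all."""
        if user_id in self.blocked:
            return False
        # Hide avatars that intrude into the personal-space bubble.
        return math.dist(my_pos, their_pos) >= self.bubble_radius
```

Enforcing this on the client means the protection holds even if another user's client misbehaves, since the offending avatar never reaches the local renderer.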

8.6. Security and Privacy

Security and privacy issues in the metaverse are significant concerns due to the vast amount of personal and sensitive data generated and exchanged within virtual environments [302]. User interactions, transactions, and personal information, including biometric data from VR/AR devices, are vulnerable to breaches and misuse. The immersive nature of the metaverse adds complexity to security challenges, as it involves real-time data exchange and extensive user tracking to deliver personalized experiences. Ensuring the security and privacy of the data is critical to maintaining user trust and safeguarding against potential threats such as identity theft, data mining, or targeted cyberattacks. The rapid growth of metaverse technologies requires equally dynamic solutions to ensure the long-term security of these ecosystems.
Addressing data security in the metaverse requires robust encryption methods to protect data in transit and at rest. Cutting-edge cryptographic techniques raise the bar for attackers, preventing malicious actors from intercepting or tampering with data exchanged between users and servers [303]. With the real-time nature of data exchanges in the metaverse, traditional encryption schemes may not always be efficient, especially when large volumes of 3D-rendered content, live interactions, and financial transactions occur simultaneously. Therefore, end-to-end encryption (E2EE) becomes vital to ensure that only the intended recipients can access the data, securing it against eavesdropping and tampering [304]. Employing multi-factor authentication (MFA) and secure login protocols can prevent unauthorized access to user accounts [305]. Regular security audits and vulnerability assessments are essential to identify and address potential weaknesses in the system. Developers must also establish protocols for incident response to quickly mitigate the effects of any security breaches.
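To make the E2EE idea concrete, the toy construction below derives a keystream from HMAC-SHA256 in counter mode and XORs it with the plaintext, so only a holder of the shared key can recover the message. This is strictly for illustration: it omits the authentication tag that real E2EE requires, and deployments should use a vetted AEAD cipher such as AES-GCM or ChaCha20-Poly1305 rather than anything hand-rolled.

```python
import hashlib
import hmac
import secrets


def _keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Derive a keystream from HMAC-SHA256 in counter mode (toy PRF-based
    stream cipher; do not use in production)."""
    out = b""
    counter = 0
    while len(out) < length:
        block = nonce + counter.to_bytes(4, "big")
        out += hmac.new(key, block, hashlib.sha256).digest()
        counter += 1
    return out[:length]


def encrypt(key: bytes, plaintext: bytes):
    """XOR the plaintext with a fresh-nonce keystream."""
    nonce = secrets.token_bytes(16)
    ks = _keystream(key, nonce, len(plaintext))
    return nonce, bytes(p ^ k for p, k in zip(plaintext, ks))


def decrypt(key: bytes, nonce: bytes, ciphertext: bytes) -> bytes:
    """XOR again with the same keystream to recover the plaintext."""
    ks = _keystream(key, nonce, len(ciphertext))
    return bytes(c ^ k for c, k in zip(ciphertext, ks))
```

The server relaying metaverse traffic never holds the key, so even a compromised relay sees only ciphertext, which is the core E2EE guarantee the paragraph describes.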
Advanced cryptographic techniques, such as homomorphic encryption, are increasingly explored to enhance data security without compromising the immersive experience [81]. Homomorphic encryption allows computations to be performed on encrypted data without needing to decrypt it first, meaning that sensitive user data can be processed without exposing it to external parties, including service providers. This can be particularly useful in the metaverse for applications that involve personal data analytics or real-time decision-making. Another promising direction is the use of blockchain technology, which can enhance security through decentralization. Blockchain allows for the secure, tamper-resistant storage of transaction data, digital assets, and even user identity credentials [60]. For example, virtual assets like non-fungible tokens (NFTs) can be securely traded within the metaverse using blockchain, ensuring transparent and secure ownership. MFA, in turn, is particularly important because users in the metaverse may handle sensitive data, such as virtual banking details or personal conversations [306]. By requiring multiple authentication methods, such as a password in combination with biometric verification (e.g., fingerprint or facial recognition), MFA can significantly reduce the risks of unauthorized access. Finally, regular security audits and vulnerability assessments should be integrated into the development lifecycle to identify and resolve weaknesses in the system.
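The "multiple authentication methods" pattern discussed above is commonly implemented with a time-based one-time password (TOTP) as the second factor. The sketch below follows RFC 6238 using only the Python standard library; real deployments would add rate limiting, clock-drift windows, and secure secret storage.

```python
import hashlib
import hmac
import struct
import time


def totp(secret: bytes, for_time=None, step: int = 30, digits: int = 6) -> str:
    """RFC 6238 time-based one-time password (MFA second factor)."""
    if for_time is None:
        for_time = time.time()
    counter = int(for_time) // step           # 30-second time window
    msg = struct.pack(">Q", counter)          # 8-byte big-endian counter
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)
```

Because the code is derived from a shared secret and the current time window, it expires within seconds, so a stolen password alone no longer grants access to a user's avatar, wallet, or conversations.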
Privacy concerns in the metaverse are equally critical, particularly as platforms may collect a wide range of personal data, from location data to behavioral patterns and biometric information (e.g., eye movement and heart rate) [307]. The amount of data that users generate while interacting in immersive environments makes users more vulnerable to surveillance, profiling, and unauthorized data sharing. One of the central challenges in preserving user privacy is maintaining a balance between collecting data necessary for delivering personalized experiences and protecting user anonymity. To address privacy concerns, developers must implement privacy-preserving technologies such as zero-knowledge proofs (ZKPs) [308]. ZKPs allow one party to prove to another that they know a value (e.g., authentication credentials or ownership of an asset) without revealing the actual value itself. This could be crucial for ensuring that users’ sensitive data, such as personal credentials or virtual asset ownership, can be verified without exposing unnecessary information. For instance, in a virtual marketplace, a ZKP-based system could allow users to verify ownership of an NFT without revealing the details of the transaction history, thereby enhancing user privacy while ensuring security.
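The prove-without-revealing idea can be demonstrated with the classic Schnorr identification protocol: the prover convinces a verifier that they know the discrete logarithm x of a public key y = G^x mod P without disclosing x. The group parameters below are tiny toy values chosen for readability; real systems use elliptic curves or multi-thousand-bit groups.

```python
import secrets

# Toy Schnorr identification over a tiny group (illustrative parameters):
# G generates a subgroup of prime order Q modulo the prime P.
P, Q, G = 23, 11, 2


def prove(x: int, challenge_fn):
    """Prover shows knowledge of x (where y = G^x mod P) without revealing x."""
    r = secrets.randbelow(Q)        # one-time blinding value
    t = pow(G, r, P)                # commitment sent to the verifier
    c = challenge_fn(t)             # verifier's random challenge
    s = (r + c * x) % Q             # response; r masks the secret x
    return t, c, s


def verify(y: int, t: int, c: int, s: int) -> bool:
    """Accept iff G^s == t * y^c (mod P), which holds only if the
    prover knew x, yet the transcript reveals nothing about x."""
    return pow(G, s, P) == (t * pow(y, c, P)) % P


x = 7                               # prover's secret
y = pow(G, x, P)                    # public key
t, c, s = prove(x, lambda t: secrets.randbelow(Q))
assert verify(y, t, c, s)
```

The same structure underlies the NFT-ownership example in the text: the response s proves knowledge of the secret, while the random r keeps the secret statistically hidden from the verifier.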
Another approach is differential privacy, a technique that adds statistical noise to data, ensuring that individual users’ actions cannot be easily distinguished when data are aggregated [309]. This technique is useful in the metaverse for collecting aggregate data (general user behavior trends) while preserving the anonymity of individual users. Homomorphic encryption, as mentioned earlier, also offers a powerful privacy solution by allowing computations on encrypted data, ensuring that service providers never have access to users’ unencrypted personal data. The implementation of blockchain-based identity management systems offers another layer of privacy protection. With self-sovereign identity (SSI) models, users maintain control over their digital identities, deciding which aspects of their data to share and with whom [265]. Instead of centralized platforms owning and controlling user identity information, SSI gives users the autonomy to manage their credentials and personal data using decentralized identity solutions on blockchain networks. However, these decentralized models also carry risks. For example, if a user loses access to their private keys, they may be permanently locked out of their identity and assets, highlighting the need for secure key management solutions within the metaverse.
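The Laplace mechanism described above is short enough to sketch directly: a count query with sensitivity 1 is released with noise of scale sensitivity/epsilon, so any single user's presence or absence changes the output distribution by at most a factor of e^epsilon. This is a minimal illustration, not a hardened DP library (which would also handle floating-point attacks and privacy budgeting).

```python
import math
import random


def laplace_noise(scale: float) -> float:
    """Sample Laplace(0, scale) via the inverse-CDF method."""
    u = random.random() - 0.5                      # uniform on [-0.5, 0.5)
    return -scale * math.copysign(math.log(1.0 - 2.0 * abs(u)), u)


def dp_count(true_count: int, epsilon: float, sensitivity: float = 1.0) -> float:
    """Release a count with epsilon-differential privacy using the
    Laplace mechanism: noise scale = sensitivity / epsilon."""
    return true_count + laplace_noise(sensitivity / epsilon)
```

Aggregated over many queries the noise averages out, so a platform can still learn, say, how many users visited a virtual venue, while no individual visit can be confidently inferred from any single released value.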
To further empower users, developers should adopt privacy-by-design principles, embedding privacy considerations directly into the development process from the outset [310]. This includes implementing transparent data handling practices and giving users granular control over their data, including what information is shared, how long it is stored, and with whom it is shared. For example, VR and AR applications could allow users to control which biometric data (e.g., eye movement, hand tracking) is processed and for what purposes [311]. In the regulatory space, compliance with global data protection laws such as the European Union’s General Data Protection Regulation (GDPR) and the California Consumer Privacy Act (CCPA) is essential for building trust [310]. These regulations mandate that users are informed about how their data are collected, used, and stored, and give them the ability to opt out of certain types of data collection or request the deletion of their data. Adhering to such standards is particularly important in the metaverse, where sensitive data like real-time geolocation and biometric information could be exposed.
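Privacy-by-design with granular control, as described above, often reduces to a simple rule in code: no data stream is processed unless the user has explicitly consented to that category. The category names and functions below are hypothetical, chosen only to illustrate the gating pattern.

```python
from dataclasses import dataclass, field


@dataclass
class ConsentLedger:
    """Per-user record of which data categories may be processed
    (category names such as 'eye_tracking' are illustrative)."""
    granted: set = field(default_factory=set)

    def grant(self, category: str) -> None:
        self.granted.add(category)

    def revoke(self, category: str) -> None:
        self.granted.discard(category)

    def allows(self, category: str) -> bool:
        return category in self.granted


def process_stream(category: str, ledger: ConsentLedger):
    """Gate every biometric/telemetry stream on recorded consent."""
    if not ledger.allows(category):
        return None                 # drop data the user never consented to
    return f"processing {category}"
```

Making the consent check the only path into processing (rather than an optional flag checked downstream) is what GDPR/CCPA-style "opt-out and deletion" obligations effectively require of metaverse clients.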
Figure 11 depicts various security challenges associated with the metaverse, focusing on the interactions between the physical and virtual worlds and the cloud infrastructure. In the physical world, users rely on VR headsets, smartphones, and laptops to access the virtual world. Data transmission between these devices and the cloud, which supports the metaverse, is vulnerable to packet loss and cyber-attacks. Hackers can exploit these vulnerabilities through methods like man-in-the-middle attacks, where they intercept and potentially alter data in transit, and denial-of-service (DoS) attacks, which can disrupt services by overwhelming the network. These security threats can lead to compromised user experiences and data integrity in the virtual world, emphasizing the need for robust security measures to protect metaverse interactions.

9. Conclusions

The real-time metaverse represents a transformative evolution in how virtual environments are designed and experienced. Unlike traditional metaverse spaces, which often rely on static, pre-rendered environments, the real-time metaverse integrates real-world data to create dynamic and responsive digital landscapes. This seamless fusion of the physical and digital worlds opens up unprecedented possibilities across industries like education, healthcare, entertainment, and business. By leveraging technologies such as artificial intelligence (AI), virtual and augmented reality (VR/AR), blockchain, and advanced networking infrastructures like 5G and edge computing, the real-time metaverse can deliver immersive, interactive experiences that evolve in sync with real-world events.
Efforts to enhance the metaverse’s infrastructure are crucial for its success. Incorporating edge computing is one such effort, as it brings data processing closer to the user, thereby reducing latency and improving response times. Additionally, developing robust interoperability standards is essential to ensure seamless communication and integration across different platforms and devices. These advancements will enable a more cohesive and efficient digital ecosystem, allowing users to transition smoothly between virtual environments and experiences.
However, the development of a fully functional real-time metaverse is not without its challenges. Latency and bandwidth issues remain critical hurdles, especially when attempting to deliver high-quality, real-time experiences to a large number of users. The scalability of this system depends heavily on the continuous improvement of network infrastructure, including the potential future integration of 6G technology. Additionally, there are significant security and privacy concerns, particularly regarding the handling of personal data and digital assets within these virtual spaces. Without robust privacy measures and data protection protocols, the real-time metaverse could face issues of trust and security, hindering its broader adoption.
As the metaverse evolves, collaborative efforts among industry, academia, and government will be necessary to fully realize its transformative capabilities. Such collaboration will drive innovation and address existing barriers, ensuring the metaverse becomes a vibrant, user-friendly, and secure digital ecosystem. By focusing on these social, technical, and governance issues, the real-time metaverse can revolutionize how we interact, work, and learn, creating new opportunities for innovation and connectivity in the digital age.
Looking forward, future research in this area should focus on several key domains. First, the development of interoperability standards is crucial to ensure that different platforms, devices, and virtual environments can communicate and interact seamlessly. Research into 6G technology, AI-driven resource management, and edge computing is also essential for overcoming latency issues and enabling scalable, real-time applications. Another important area of study is the integration of multisensory feedback, such as haptics, smell, and taste, to further enhance the immersive quality of virtual experiences. Lastly, addressing the ethical and social challenges of the real-time metaverse, such as digital harassment and equitable representation, will be crucial for ensuring the technology’s responsible and inclusive growth.

Author Contributions

Conceptualization, M.H., Q.Q. and Y.C.; methodology, M.H., Q.Q., Y.C. and H.K.; software, M.H. and Q.Q.; validation, M.H., Q.Q. and Y.C.; formal analysis, M.H. and Q.Q.; investigation, Y.C. and H.K.; resources, Y.C. and E.A.-C.; data curation, M.H. and Q.Q.; writing—original draft preparation, M.H., Q.Q. and Y.C.; writing—review and editing, Y.C., H.K., E.B. and E.A.-C.; visualization, M.H. and Q.Q.; supervision, Y.C. and E.B.; project administration, E.A.-C.; funding acquisition, Y.C. and E.A.-C. All authors have read and agreed to the published version of the manuscript.

Funding

This research was partially funded by the Air Force Research Laboratory under SFFP FA9550-20-F-0005.

Data Availability Statement

Data are available following the rules of the AFRL and Binghamton University.

Conflicts of Interest

The authors declare no conflicts of interest.

Abbreviations

The following abbreviations are used in this manuscript:
3D: three-dimensional
5G: fifth-generation
6G: sixth-generation
AI: artificial intelligence
AP: access point
API: application programming interface
AR: augmented reality
AWS: Amazon Web Services
CCPA: California Consumer Privacy Act
CDN: content delivery network
CUDA: compute unified device architecture
DLSS: deep learning super sampling
DML: deep and meaningful learning
DoS: denial-of-service
DT: digital twin
DTs: digital twins
E2EE: end-to-end encryption
FBX: Filmbox
FOV: field of view
GCP: Google Cloud Platform
GDPR: General Data Protection Regulation
glTF: GL transmission format
GPU: graphics processing unit
GR: graphical rendering
HL7: Health Level Seven
HPC: high-performance computing
HTML: hypertext markup language
HTTP: hypertext transfer protocol
IaaS: infrastructure as a service
IoT: Internet of Things
JPEG: Joint Photographic Experts Group
JSON: JavaScript Object Notation
LLM: large language model
MFA: multi-factor authentication
MIMO: multiple-input multiple-output
ML: machine learning
mMTC: massive machine-type communications
MPI: message passing interface
MR: mixed reality
NFTs: non-fungible tokens
NLP: natural language processing
NPC: non-player character
OBJ: Wavefront object
OpenMP: open multi-processing
PaaS: platform as a service
PBR: physically based rendering
PNG: portable network graphic
PoS: proof-of-stake
PoW: proof-of-work
RGB: red, green, and blue
SaaS: software as a service
SSI: self-sovereign identity
STT: speech-to-text
SVREs: social virtual reality environments
TCP/IP: transmission control protocol/internet protocol
TPS: transactions per second
TTS: text-to-speech
URLLC: ultra-reliable low-latency communications
UX: user experience
VR: virtual reality
W3C: World Wide Web Consortium
WebGL: Web Graphics Library
WebRTC: web real-time communication
WebXR: web extended reality
XML: extensible markup language
XMPP: extensible messaging and presence protocol
ZKPs: zero-knowledge proofs

References

  1. Wang, Y.; Su, Z.; Zhang, N.; Xing, R.; Liu, D.; Luan, T.H.; Shen, X. A survey on metaverse: Fundamentals, security, and privacy. IEEE Commun. Surv. Tutor. 2022, 25, 319–352. [Google Scholar] [CrossRef]
  2. Li, K.; Cui, Y.; Li, W.; Lv, T.; Yuan, X.; Li, S.; Ni, W.; Simsek, M.; Dressler, F. When internet of things meets metaverse: Convergence of physical and cyber worlds. IEEE Internet Things J. 2022, 10, 4148–4173. [Google Scholar] [CrossRef]
  3. Setiawan, K.D.; Anthony, A.; Meyliana; Surjandy. The essential factor of metaverse for business based on 7 layers of metaverse—Systematic literature review. In Proceedings of the 2022 International Conference on Information Management and Technology (ICIMTech), Semarang, Indonesia, 11–12 August 2022; pp. 687–692. [Google Scholar]
  4. Tyagi, A.K. Decentralized everything: Practical use of blockchain technology in future applications. In Distributed Computing to Blockchain; Elsevier: Amsterdam, The Netherlands, 2023; pp. 19–38. [Google Scholar]
  5. Truong, V.T.; Le, L.; Niyato, D. Blockchain meets metaverse and digital asset management: A comprehensive survey. IEEE Access 2023, 11, 26258–26288. [Google Scholar] [CrossRef]
  6. Zhu, H.Y.; Hieu, N.Q.; Hoang, D.T.; Nguyen, D.N.; Lin, C.T. A human-centric metaverse enabled by brain-computer interface: A survey. IEEE Commun. Surv. Tutor. 2024, 26, 2120–2145. [Google Scholar] [CrossRef]
  7. Park, S.M.; Kim, Y.G. A metaverse: Taxonomy, components, applications, and open challenges. IEEE Access 2022, 10, 4209–4251. [Google Scholar] [CrossRef]
  8. Liu, S.; Xie, J.; Wang, X. QoE enhancement of the industrial metaverse based on Mixed Reality application optimization. Displays 2023, 79, 102463. [Google Scholar] [CrossRef]
  9. Yang, Y.; Ma, M.; Van Doan, T.; Hulin, T.; Fitzek, H.F.; Nguyen, G.T. Touch the Metaverse: Demonstration of Haptic Feedback in Network-Assisted Augmented Reality. In Proceedings of the 2024 IEEE International Conference on Pervasive Computing and Communications Workshops and Other Affiliated Events (PerCom Workshops), Biarritz, France, 11–15 March 2024; pp. 379–381. [Google Scholar]
  10. Mystakidis, S. Metaverse. Encyclopedia 2022, 2, 486–497. [Google Scholar] [CrossRef]
  11. Singla, B.; Shalender, K.; Singh, N. Creator’s Economy in Metaverse Platforms: Empowering Stakeholders Through Omnichannel Approach: Empowering Stakeholders Through Omnichannel Approach; IGI Global: Hershey, PA, USA, 2024. [Google Scholar]
  12. Park, S.; Kim, S. Identifying world types to deliver gameful experiences for sustainable learning in the metaverse. Sustainability 2022, 14, 1361. [Google Scholar] [CrossRef]
  13. Mancuso, I.; Petruzzelli, A.M.; Panniello, U. Digital business model innovation in metaverse: How to approach virtual economy opportunities. Inf. Process. Manag. 2023, 60, 103457. [Google Scholar] [CrossRef]
  14. Xu, M.; Ng, W.C.; Lim, W.Y.B.; Kang, J.; Xiong, Z.; Niyato, D.; Yang, Q.; Shen, X.; Miao, C. A full dive into realizing the edge-enabled metaverse: Visions, enabling technologies, and challenges. IEEE Commun. Surv. Tutor. 2022, 25, 656–700. [Google Scholar] [CrossRef]
  15. Dwivedi, Y.K.; Hughes, L.; Baabdullah, A.M.; Ribeiro-Navarrete, S.; Giannakis, M.; Al-Debei, M.M.; Dennehy, D.; Metri, B.; Buhalis, D.; Cheung, C.M.; et al. Metaverse beyond the hype: Multidisciplinary perspectives on emerging challenges, opportunities, and agenda for research, practice and policy. Int. J. Inf. Manag. 2022, 66, 102542. [Google Scholar] [CrossRef]
  16. Qu, Q.; Hatami, M.; Xu, R.; Nagothu, D.; Chen, Y.; Li, X.; Blasch, E.; Ardiles-Cruz, E.; Chen, G. The microverse: A task-oriented edge-scale metaverse. Future Internet 2024, 16, 60. [Google Scholar] [CrossRef]
  17. Gadekallu, T.R.; Huynh-The, T.; Wang, W.; Yenduri, G.; Ranaweera, P.; Pham, Q.V.; da Costa, D.B.; Liyanage, M. Blockchain for the metaverse: A review. arXiv 2022, arXiv:2203.09738. [Google Scholar]
  18. Ersoy, M.; Gürfidan, R. Blockchain-based asset storage and service mechanism to metaverse universe: Metarepo. Trans. Emerg. Telecommun. Technol. 2023, 34, e4658. [Google Scholar] [CrossRef]
  19. Huang, H.; Wu, J.; Zheng, Z. From Blockchain to Web3 and Metaverse; Springer: Berlin/Heidelberg, Germany, 2023. [Google Scholar]
  20. Dincelli, E.; Yayla, A. Immersive virtual reality in the age of the Metaverse: A hybrid-narrative review based on the technology affordance perspective. J. Strateg. Inf. Syst. 2022, 31, 101717. [Google Scholar] [CrossRef]
  21. Wang, H.; Ning, H.; Lin, Y.; Wang, W.; Dhelim, S.; Farha, F.; Ding, J.; Daneshmand, M. A survey on the metaverse: The state-of-the-art, technologies, applications, and challenges. IEEE Internet Things J. 2023, 10, 14671–14688. [Google Scholar] [CrossRef]
  22. Kala, V.; Chudasama, S.; Bathiya, J.; Jain, D.; Barik, P.K. Optimized 5G Resource Allocation for Metaverse Applications. In Proceedings of the 2024 International Conference on Electronics, Computing, Communication and Control Technology (ICECCC), Kuala Lumpur, Malaysia, 22–24 March 2024; pp. 1–6. [Google Scholar]
  23. Cardoso, L.F.d.S.; Kimura, B.Y.L.; Zorzal, E.R. Towards augmented and mixed reality on future mobile networks. Multimed. Tools Appl. 2024, 83, 9067–9102. [Google Scholar] [CrossRef]
  24. Huynh-The, T.; Pham, Q.V.; Pham, X.Q.; Nguyen, T.T.; Han, Z.; Kim, D.S. Artificial intelligence for the metaverse: A survey. Eng. Appl. Artif. Intell. 2023, 117, 105581. [Google Scholar] [CrossRef]
  25. Zhao, Y.; Jiang, J.; Chen, Y.; Liu, R.; Yang, Y.; Xue, X.; Chen, S. Metaverse: Perspectives from graphics, interactions and visualization. Vis. Inform. 2022, 6, 56–67. [Google Scholar] [CrossRef]
  26. Hadjiaros, M.; Shimi, A.; Avraamides, M.N.; Neokleous, K.; Pattichis, C.S. Virtual Reality Brain-Computer Interfacing and the role of cognitive skills. IEEE Access 2024, 12, 129240–129261. [Google Scholar] [CrossRef]
  27. Park, K.S. Brain-computer interface. In Humans and Electricity: Understanding Body Electricity and Applications; Springer: Berlin/Heidelberg, Germany, 2023; pp. 223–248. [Google Scholar]
  28. Aloqaily, M.; Bouachir, O.; Karray, F.; Al Ridhawi, I.; El Saddik, A. Integrating digital twin and advanced intelligent technologies to realize the metaverse. IEEE Consum. Electron. Mag. 2022, 12, 47–55. [Google Scholar] [CrossRef]
  29. Padmanaban, S.; Nasab, M.A.; Hatami, M.; Milani, O.H.; Dashtaki, M.A.; Nasab, M.A.; Zand, M. The Impact of the Internet of Things in the Smart City from the Point of View of Energy Consumption Optimization. In Biomass and Solar-Powered Sustainable Digital Cities; Wiley: Hoboken, NJ, USA, 2024; pp. 81–122. [Google Scholar]
  30. Grande, R.; Albusac, J.; Vallejo, D.; Glez-Morcillo, C.; Castro-Schez, J.J. Performance Evaluation and Optimization of 3D Models from Low-Cost 3D Scanning Technologies for Virtual Reality and Metaverse E-Commerce. Appl. Sci. 2024, 14, 6037. [Google Scholar] [CrossRef]
  31. Singh, A.; Mishra, S.; Jain, S.; Dogra, S.; Awasthi, A.; Roy, N.R.; Sodhi, K. 8 Exploring practical use-cases of augmented reality using photogrammetry and other 3D reconstruction tools in the Metaverse. Augment. Virtual Real. Ind. 5.0 2023, 2, 163. [Google Scholar]
  32. Aslam, A.M.; Chaudhary, R.; Bhardwaj, A.; Budhiraja, I.; Kumar, N.; Zeadally, S. Metaverse for 6G and beyond: The next revolution and deployment challenges. IEEE Internet Things Mag. 2023, 6, 32–39. [Google Scholar] [CrossRef]
  33. Hennig-Thurau, T.; Aliman, D.N.; Herting, A.M.; Cziehso, G.P.; Linder, M.; Kübler, R.V. Social interactions in the metaverse: Framework, initial evidence, and research roadmap. J. Acad. Mark. Sci. 2023, 51, 889–913. [Google Scholar] [CrossRef]
  34. Fascista, A. Toward integrated large-scale environmental monitoring using WSN/UAV/Crowdsensing: A review of applications, signal processing, and future perspectives. Sensors 2022, 22, 1824. [Google Scholar] [CrossRef]
  35. Zhao, F.; Zhang, C.; Geng, B. Deep Multimodal Data Fusion. ACM Comput. Surv. 2024, 56, 1–36. [Google Scholar] [CrossRef]
  36. Yu, J.; Alhilal, A.; Hui, P.; Tsang, D.H. Bi-directional digital twin and edge computing in the metaverse. IEEE Internet Things Mag. 2024, 7, 106–112. [Google Scholar] [CrossRef]
  37. Li, K.; Lau, B.P.L.; Yuan, X.; Ni, W.; Guizani, M.; Yuen, C. Towards ubiquitous semantic metaverse: Challenges, approaches, and opportunities. IEEE Internet Things J. 2023, 10, 21855–21872. [Google Scholar] [CrossRef]
  38. Cao, X.; Chen, H.; Gelbal, S.Y.; Aksun-Guvenc, B.; Guvenc, L. Vehicle-in-Virtual-Environment (VVE) method for autonomous driving system development, evaluation and demonstration. Sensors 2023, 23, 5088. [Google Scholar] [CrossRef]
  39. Xu, H.; Berres, A.; Yoginath, S.B.; Sorensen, H.; Nugent, P.J.; Severino, J.; Tennille, S.A.; Moore, A.; Jones, W.; Sanyal, J. Smart mobility in the cloud: Enabling real-time situational awareness and cyber-physical control through a digital twin for traffic. IEEE Trans. Intell. Transp. Syst. 2023, 24, 3145–3156. [Google Scholar] [CrossRef]
  40. Kušić, K.; Schumann, R.; Ivanjko, E. A digital twin in transportation: Real-time synergy of traffic data streams and simulation for virtualizing motorway dynamics. Adv. Eng. Inform. 2023, 55, 101858. [Google Scholar] [CrossRef]
  41. Tuhaise, V.V.; Tah, J.H.M.; Abanda, F.H. Technologies for digital twin applications in construction. Autom. Constr. 2023, 152, 104931. [Google Scholar] [CrossRef]
  42. Gkontzis, A.F.; Kotsiantis, S.; Feretzakis, G.; Verykios, V.S. Enhancing urban resilience: Smart city data analyses, forecasts, and digital twin techniques at the neighborhood level. Future Internet 2024, 16, 47. [Google Scholar] [CrossRef]
  43. Ogunsakin, R.; Mehandjiev, N.; Marin, C.A. Towards adaptive digital twins architecture. Comput. Ind. 2023, 149, 103920. [Google Scholar] [CrossRef]
  44. Tian, H.; Lee, G.A.; Bai, H.; Billinghurst, M. Using virtual replicas to improve mixed reality remote collaboration. IEEE Trans. Vis. Comput. Graph. 2023, 29, 2785–2795. [Google Scholar] [CrossRef]
  45. Wang, C.M.; Huang, Q.J. Design of a Technology-Based Magic Show System with Virtual User Interfacing to Enhance the Entertainment Effects. Appl. Sci. 2024, 14, 5535. [Google Scholar] [CrossRef]
  46. Oh, S.; Chung, K.; Aliyu, I.; Hahn, M.; Jeong, I.K.; Yu, C.R.; Um, T.W.; Kim, J. XDN-Based Network Framework Design to Communicate Interaction in Virtual Concerts with Metaverse Platforms. Appl. Sci. 2023, 13, 9509. [Google Scholar] [CrossRef]
  47. Kim, D.Y.; Lee, H.K.; Chung, K. Avatar-mediated experience in the metaverse: The impact of avatar realism on user-avatar relationship. J. Retail. Consum. Serv. 2023, 73, 103382. [Google Scholar] [CrossRef]
  48. Hadi, R.; Melumad, S.; Park, E.S. The Metaverse: A new digital frontier for consumer behavior. J. Consum. Psychol. 2024, 34, 142–166. [Google Scholar] [CrossRef]
  49. Liu, Y.; Deng, Y.; Nallanathan, A.; Yuan, J. Machine learning for 6G enhanced ultra-reliable and low-latency services. IEEE Wirel. Commun. 2023, 30, 48–54. [Google Scholar] [CrossRef]
  50. Chen, W.; Milosevic, Z.; Rabhi, F.A.; Berry, A. Real-time analytics: Concepts, architectures and ML/AI considerations. IEEE Access 2023, 11, 71634–71657. [Google Scholar] [CrossRef]
  51. Sun, T.; Feng, B.; Huo, J.; Xiao, Y.; Wang, W.; Peng, J.; Li, Z.; Du, C.; Wang, W.; Zou, G.; et al. Artificial intelligence meets flexible sensors: Emerging smart flexible sensing systems driven by machine learning and artificial synapses. Nano-Micro Lett. 2024, 16, 14. [Google Scholar] [CrossRef] [PubMed]
  52. Sibley, J.; Swenson, A. Events in the metaverse. In Virtual Events Management: Theory and Methods for Event Management and Tourism; Goodfellow Pub Ltd.: Wolvercote, UK, 2023; p. 155. [Google Scholar]
  53. Onecha, B.; Cornadó, C.; Morros, J.; Pons, O. New approach to design and assess metaverse environments for improving learning processes in higher education: The case of architectural construction and rehabilitation. Buildings 2023, 13, 1340. [Google Scholar] [CrossRef]
  54. AlGerafi, M.A.; Zhou, Y.; Oubibi, M.; Wijaya, T.T. Unlocking the potential: A comprehensive evaluation of augmented reality and virtual reality in education. Electronics 2023, 12, 3953. [Google Scholar] [CrossRef]
  55. Huynh-The, T.; Gadekallu, T.R.; Wang, W.; Yenduri, G.; Ranaweera, P.; Pham, Q.V.; da Costa, D.B.; Liyanage, M. Blockchain for the metaverse: A review. Future Gener. Comput. Syst. 2023, 143, 401–419. [Google Scholar] [CrossRef]
  56. Lee, L.H.; Braud, T.; Zhou, P.; Wang, L.; Xu, D.; Lin, Z.; Kumar, A.; Bermejo, C.; Hui, P. All one needs to know about metaverse: A complete survey on technological singularity, virtual ecosystem, and research agenda. arXiv 2021, arXiv:2110.05352. [Google Scholar]
  57. Zhang, B.; Zhu, J.; Su, H. Toward the third generation artificial intelligence. Sci. China Inf. Sci. 2023, 66, 121101. [Google Scholar] [CrossRef]
  58. Saxena, A.C.; Ojha, A.; Sobti, D.; Khang, A. Artificial Intelligence (AI)-Centric Model in the Metaverse Ecosystem. In Handbook of Research on AI-Based Technologies and Applications in the Era of the Metaverse; IGI Global: Hershey, PA, USA, 2023; pp. 1–24. [Google Scholar]
  59. Duong, T.Q.; Van Huynh, D.; Khosravirad, S.R.; Sharma, V.; Dobre, O.A.; Shin, H. From digital twin to metaverse: The role of 6G ultra-reliable and low-latency communications with multi-tier computing. IEEE Wirel. Commun. 2023, 30, 140–146. [Google Scholar] [CrossRef]
  60. Seo, J.; Park, S. SBAC: Substitution cipher access control based on blockchain for protecting personal data in metaverse. Future Gener. Comput. Syst. 2024, 151, 85–97. [Google Scholar] [CrossRef]
  61. Hyun, W. Study on standardization for interoperable metaverse. In Proceedings of the 2023 25th International Conference on Advanced Communication Technology (ICACT), Pyeongchang, Republic of Korea, 19–22 February 2023; pp. 319–322. [Google Scholar]
  62. Karunarathna, S.; Wijethilaka, S.; Ranaweera, P.; Hemachandra, K.T.; Samarasinghe, T.; Liyanage, M. The role of network slicing and edge computing in the metaverse realization. IEEE Access 2023, 11, 25502–25530. [Google Scholar] [CrossRef]
  63. Sun, Z.; Zhu, M.; Shan, X.; Lee, C. Augmented tactile-perception and haptic-feedback rings as human-machine interfaces aiming for immersive interactions. Nat. Commun. 2022, 13, 5224. [Google Scholar] [CrossRef] [PubMed]
  64. Hernan, G.; Ingale, N.; Somayaji, S.; Veerubhotla, A. Virtual Reality-Based Interventions to Improve Balance in Patients with Traumatic Brain Injury: A Scoping Review. Brain Sci. 2024, 14, 429. [Google Scholar] [CrossRef]
  65. Tefertiller, C.; Ketchum, J.M.; Bartelt, P.; Peckham, M.; Hays, K. Feasibility of virtual reality and treadmill training in traumatic brain injury: A randomized controlled pilot trial. Brain Inj. 2022, 36, 898–908. [Google Scholar] [CrossRef]
  66. Ullah, H.; Manickam, S.; Obaidat, M.; Laghari, S.U.A.; Uddin, M. Exploring the potential of metaverse technology in healthcare: Applications, challenges, and future directions. IEEE Access 2023, 11, 69686–69707. [Google Scholar] [CrossRef]
  67. Sokołowska, B. Being in Virtual Reality and Its Influence on Brain Health—An Overview of Benefits, Limitations and Prospects. Brain Sci. 2024, 14, 72. [Google Scholar] [CrossRef]
  68. Devane, N.; Behn, N.; Marshall, J.; Ramachandran, A.; Wilson, S.; Hilari, K. The use of virtual reality in the rehabilitation of aphasia: A systematic review. Disabil. Rehabil. 2023, 45, 3803–3822. [Google Scholar] [CrossRef] [PubMed]
  69. Freeman, D.; Rosebrock, L.; Waite, F.; Loe, B.S.; Kabir, T.; Petit, A.; Dudley, R.; Chapman, K.; Morrison, A.; O’Regan, E.; et al. Virtual reality (VR) therapy for patients with psychosis: Satisfaction and side effects. Psychol. Med. 2023, 53, 4373–4384. [Google Scholar] [CrossRef]
  70. Suh, I.; McKinney, T.; Siu, K.C. Current perspective of metaverse application in medical education, research and patient care. Virtual Worlds 2023, 2, 115–128. [Google Scholar] [CrossRef]
  71. Spence, C. Digitally enhancing tasting experiences. Int. J. Gastron. Food Sci. 2023, 32, 100695. [Google Scholar] [CrossRef]
  72. Atanasio, M.; Sansone, F.; Conte, R.; Tonacci, A. Exploring Taste Sensation in the Metaverse: A Literature Review. In Proceedings of the 2024 IEEE Gaming, Entertainment, and Media Conference (GEM), Turin, Italy, 5–7 June 2024; pp. 1–6. [Google Scholar]
  73. Monaco, S.; Sacchi, G. Travelling the metaverse: Potential benefits and main challenges for tourism sectors and research applications. Sustainability 2023, 15, 3348. [Google Scholar] [CrossRef]
  74. Alsamhi, M.H.; Hawbani, A.; Kumar, S.; Alsamhi, S.H. Multisensory metaverse-6G: A new paradigm of commerce and education. IEEE Access 2024, 12, 75657–75677. [Google Scholar] [CrossRef]
  75. Yang, B.; Yang, S.; Lv, Z.; Wang, F.; Olofsson, T. Application of digital twins and metaverse in the field of fluid machinery pumps and fans: A review. Sensors 2022, 22, 9294. [Google Scholar] [CrossRef] [PubMed]
  76. Emami, M.; Bayat, A.; Tafazolli, R.; Quddus, A. A Survey on Haptics: Communication, Sensing and Feedback. Available online: https://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=10637258 (accessed on 15 October 2024).
  77. Chengoden, R.; Victor, N.; Huynh-The, T.; Yenduri, G.; Jhaveri, R.H.; Alazab, M.; Bhattacharya, S.; Hegde, P.; Maddikunta, P.K.R.; Gadekallu, T.R. Metaverse for healthcare: A survey on potential applications, challenges and future directions. IEEE Access 2023, 11, 12765–12795. [Google Scholar] [CrossRef]
  78. Kim, M.; Oh, J.; Son, S.; Park, Y.; Kim, J.; Park, Y. Secure and privacy-preserving authentication scheme using decentralized identifier in metaverse environment. Electronics 2023, 12, 4073. [Google Scholar] [CrossRef]
  79. Yang, K.; Zhang, Z.; Youliang, T.; Ma, J. A secure authentication framework to guarantee the traceability of avatars in metaverse. IEEE Trans. Inf. Forensics Secur. 2023, 18, 3817–3832. [Google Scholar] [CrossRef]
  80. Jim, J.R.; Hosain, M.T.; Mridha, M.F.; Kabir, M.M.; Shin, J. Towards trustworthy metaverse: Advancements and challenges. IEEE Access 2023. [Google Scholar] [CrossRef]
  81. Chen, C.; Li, Y.; Wu, Z.; Mai, C.; Liu, Y.; Hu, Y.; Kang, J.; Zheng, Z. Privacy computing meets metaverse: Necessity, taxonomy and challenges. Ad Hoc Netw. 2024, 158, 103457. [Google Scholar] [CrossRef]
  82. Karaarslan, E.; Yazici Yilmaz, S. Metaverse and Decentralization. In Metaverse: Technologies, Opportunities and Threats; Springer: Berlin/Heidelberg, Germany, 2023; pp. 31–44. [Google Scholar]
  83. Lee, H.J.; Gu, H.H. Empirical research on the metaverse user experience of digital natives. Sustainability 2022, 14, 14747. [Google Scholar] [CrossRef]
  84. Mourtzis, D.; Angelopoulos, J.; Panopoulos, N. Smart manufacturing and tactile internet based on 5G in industry 4.0: Challenges, applications and new trends. Electronics 2021, 10, 3175. [Google Scholar] [CrossRef]
85. Cheng, R.; Wu, N.; Chen, S.; Han, B. Will metaverse be NextG Internet? Vision, hype, and reality. IEEE Netw. 2022, 36, 197–204. [Google Scholar] [CrossRef]
  86. Qu, Q.; Chen, Y.; Li, X.; Blasch, E.; Chen, G.; Ardiles-Cruz, E. Low-cost collision avoidance in microverse for unmanned aerial vehicle delivery networks. In Proceedings of the Sensors and Systems for Space Applications XVII, National Harbor, MD, USA, 21–25 April 2024; Volume 13062, pp. 215–226. [Google Scholar]
  87. Sun, H.; Qu, Q.; Chen, Y.; Ardiles-Cruz, E.; Blasch, E. Senior Safety Monitoring in Microverse Using Fuzzy Neural Network Enhanced Action Recognition. Available online: https://www.authorea.com/users/805226/articles/1192576-senior-safety-monitoring-in-microverse-using-fuzzy-neural-network-enhanced-action-recognition (accessed on 15 October 2024).
  88. Yang, L.; Ni, S.T.; Wang, Y.; Yu, A.; Lee, J.A.; Hui, P. Interoperability of the Metaverse: A Digital Ecosystem Perspective Review. arXiv 2024, arXiv:2403.05205. [Google Scholar]
  89. Mangrulkar, R.S.; Chavan, P.V. Blockchain Essentials; Springer: Berlin/Heidelberg, Germany, 2024. [Google Scholar]
  90. Ritterbusch, G.D.; Teichmann, M.R. Defining the metaverse: A systematic literature review. IEEE Access 2023, 11, 12368–12377. [Google Scholar] [CrossRef]
  91. Blasch, E.; Pham, T.; Chong, C.Y.; Koch, W.; Leung, H.; Braines, D.; Abdelzaher, T. Machine learning/artificial intelligence for sensor data fusion–opportunities and challenges. IEEE Aerosp. Electron. Syst. Mag. 2021, 36, 80–93. [Google Scholar] [CrossRef]
  92. Lee, H.; Hwang, Y. Technology-enhanced education through VR-making and metaverse-linking to foster teacher readiness and sustainable learning. Sustainability 2022, 14, 4786. [Google Scholar] [CrossRef]
  93. Mystakidis, S.; Berki, E.; Valtanen, J.P. Deep and meaningful e-learning with social virtual reality environments in higher education: A systematic literature review. Appl. Sci. 2021, 11, 2412. [Google Scholar] [CrossRef]
  94. Childs, E.; Mohammad, F.; Stevens, L.; Burbelo, H.; Awoke, A.; Rewkowski, N.; Manocha, D. An overview of enhancing distance learning through emerging augmented and virtual reality technologies. IEEE Trans. Vis. Comput. Graph. 2023, 30, 4480–4496. [Google Scholar] [CrossRef]
  95. Kaddoura, S.; Al Husseiny, F. The rising trend of Metaverse in education: Challenges, opportunities, and ethical considerations. PeerJ Comput. Sci. 2023, 9, e1252. [Google Scholar] [CrossRef] [PubMed]
  96. Yang, L. Recommendations for metaverse governance based on technical standards. Humanit. Soc. Sci. Commun. 2023, 10, 1–10. [Google Scholar] [CrossRef]
  97. Khan, M.; Hatami, M.; Zhao, W.; Chen, Y. A novel trusted hardware-based scalable security framework for IoT edge devices. Discov. Internet Things 2024, 4, 4. [Google Scholar] [CrossRef]
  98. Yaqoob, I.; Salah, K.; Jayaraman, R.; Omar, M. Metaverse applications in smart cities: Enabling technologies, opportunities, challenges, and future directions. Internet Things 2023, 23, 100884. [Google Scholar] [CrossRef]
  99. Wijayathunga, L.; Rassau, A.; Chai, D. Challenges and solutions for autonomous ground robot scene understanding and navigation in unstructured outdoor environments: A review. Appl. Sci. 2023, 13, 9877. [Google Scholar] [CrossRef]
  100. Gill, S.S.; Wu, H.; Patros, P.; Ottaviani, C.; Arora, P.; Pujol, V.C.; Haunschild, D.; Parlikad, A.K.; Cetinkaya, O.; Lutfiyya, H.; et al. Modern computing: Vision and challenges. Telemat. Inform. Rep. 2024, 13, 100116. [Google Scholar] [CrossRef]
  101. Nica, E.; Popescu, G.H.; Poliak, M.; Kliestik, T.; Sabie, O.M. Digital twin simulation tools, spatial cognition algorithms, and multi-sensor fusion technology in sustainable urban governance networks. Mathematics 2023, 11, 1981. [Google Scholar] [CrossRef]
  102. Ounoughi, C.; Yahia, S.B. Data fusion for ITS: A systematic literature review. Inf. Fusion 2023, 89, 267–291. [Google Scholar] [CrossRef]
  103. Wang, X.; Li, K.; Chehri, A. Multi-sensor fusion technology for 3D object detection in autonomous driving: A review. IEEE Trans. Intell. Transp. Syst. 2023, 25, 1148–1165. [Google Scholar] [CrossRef]
  104. Dahan, N.A.; Al-Razgan, M.; Al-Laith, A.; Alsoufi, M.A.; Al-Asaly, M.S.; Alfakih, T. Metaverse framework: A case study on E-learning environment (ELEM). Electronics 2022, 11, 1616. [Google Scholar] [CrossRef]
  105. Cheng, S. Metaverse and immersive interaction technology. In Metaverse: Concept, Content and Context; Springer: Berlin/Heidelberg, Germany, 2023; pp. 47–81. [Google Scholar]
  106. Yao, X.; Ma, N.; Zhang, J.; Wang, K.; Yang, E.; Faccio, M. Enhancing wisdom manufacturing as industrial metaverse for industry and society 5.0. J. Intell. Manuf. 2024, 35, 235–255. [Google Scholar] [CrossRef]
  107. Xu, M.; Niyato, D.; Chen, J.; Zhang, H.; Kang, J.; Xiong, Z.; Mao, S.; Han, Z. Generative AI-empowered simulation for autonomous driving in vehicular mixed reality metaverses. IEEE J. Sel. Top. Signal Process. 2023, 17, 1064–1079. [Google Scholar] [CrossRef]
  108. Chamola, V.; Bansal, G.; Das, T.K.; Hassija, V.; Sai, S.; Wang, J.; Zeadally, S.; Hussain, A.; Yu, F.R.; Guizani, M.; et al. Beyond reality: The pivotal role of generative ai in the metaverse. IEEE Internet Things Mag. 2024, 7, 126–135. [Google Scholar] [CrossRef]
109. Fernández-Caramés, T.M.; Fraga-Lamas, P. Forging the Industrial Metaverse: Where Industry 5.0, Augmented and Mixed Reality, IIoT, Opportunistic Edge Computing and Digital Twins Meet. arXiv 2024, arXiv:2403.11312. [Google Scholar]
  110. Behnke, I.; Austad, H. Real-time performance of industrial IoT communication technologies: A review. IEEE Internet Things J. 2023, 11, 7399–7410. [Google Scholar] [CrossRef]
  111. Yao, Y.; Chang, X.; Li, L.; Liu, J.; Mišić, J.; Mišić, V.B. DIDs-assisted secure cross-metaverse authentication scheme for MEC-enabled metaverse. In Proceedings of the ICC 2023-IEEE International Conference on Communications, Rome, Italy, 28 May–1 June 2023; pp. 6318–6323. [Google Scholar]
  112. Zhang, L.; Du, Q.; Lu, L.; Zhang, S. Overview of the integration of communications, sensing, computing, and storage as enabling technologies for the metaverse over 6G networks. Electronics 2023, 12, 3651. [Google Scholar] [CrossRef]
  113. Zhao, L.; Yang, Q.; Huang, H.; Guo, L.; Jiang, S. Intelligent wireless sensing driven metaverse: A survey. Comput. Commun. 2024, 214, 46–56. [Google Scholar] [CrossRef]
  114. Gu, X.; Yuan, Y.; Yang, J.; Li, L. AI-empowered Pose Reconstruction for Real-time Synthesis of Remote Metaverse Avatars. In Proceedings of the 2024 21st International Joint Conference on Computer Science and Software Engineering (JCSSE), Phuket, Thailand, 19–22 June 2024; pp. 86–93. [Google Scholar]
  115. Huang, F.; Chen, Y.; Wang, X.; Wang, S.; Wu, X. Spectral clustering super-resolution imaging based on multispectral camera array. IEEE Trans. Image Process. 2023, 32, 1257–1271. [Google Scholar] [CrossRef]
  116. Tang, Y.; Song, S.; Gui, S.; Chao, W.; Cheng, C.; Qin, R. Active and low-cost hyperspectral imaging for the spectral analysis of a low-light environment. Sensors 2023, 23, 1437. [Google Scholar] [CrossRef] [PubMed]
  117. Munir, A.; Aved, A.; Blasch, E. Situational awareness: Techniques, challenges, and prospects. AI 2022, 3, 55–77. [Google Scholar] [CrossRef]
  118. Tang, Q.; Liang, J.; Zhu, F. A comparative review on multi-modal sensors fusion based on deep learning. Signal Process. 2023, 213, 109165. [Google Scholar] [CrossRef]
  119. Zhao, L.; Zhou, H.; Zhu, X.; Song, X.; Li, H.; Tao, W. Lif-seg: Lidar and camera image fusion for 3d lidar semantic segmentation. IEEE Trans. Multimed. 2023, 26, 1158–1168. [Google Scholar] [CrossRef]
120. Smigaj, M.; Agarwal, A.; Bartholomeus, H.; Decuyper, M.; Elsherif, A.; de Jonge, A.; Kooistra, L. Thermal infrared remote sensing of stress responses in forest environments: A review of developments, challenges, and opportunities. Curr. For. Rep. 2024, 10, 56–76. [Google Scholar] [CrossRef]
  121. Wu, G.; Chen, Z.; Dang, J. Data Fusion and Digital Modeling. In Intelligent Bridge Maintenance and Management: Emerging Digital Technologies; Springer: Berlin/Heidelberg, Germany, 2024; pp. 337–401. [Google Scholar]
  122. Huang, M.; Feng, R.; Zou, L.; Li, R.; Xie, J. Enhancing Telecooperation Through Haptic Twin for Internet of Robotic Things: Implementation and Challenges. IEEE Internet Things J. 2024, 11, 32440–32453. [Google Scholar] [CrossRef]
  123. Baidya, T.; Moh, S. Comprehensive survey on resource allocation for edge-computing-enabled metaverse. Comput. Sci. Rev. 2024, 54, 100680. [Google Scholar] [CrossRef]
  124. Zawish, M.; Dharejo, F.A.; Khowaja, S.A.; Raza, S.; Davy, S.; Dev, K.; Bellavista, P. AI and 6G into the metaverse: Fundamentals, challenges and future research trends. IEEE Open J. Commun. Soc. 2024, 5, 730–778. [Google Scholar] [CrossRef]
  125. Yin, F.; Shi, F. A comparative survey of big data computing and HPC: From a parallel programming model to a cluster architecture. Int. J. Parallel Program. 2022, 50, 27–64. [Google Scholar] [CrossRef]
  126. Hamid, N.A.W.A.; Singh, B. High-Performance Computing Based Operating Systems, Software Dependencies and IoT Integration. In High Performance Computing in Biomimetics: Modeling, Architecture and Applications; Springer: Berlin/Heidelberg, Germany, 2024; pp. 175–204. [Google Scholar]
  127. Navaux, P.O.A.; Lorenzon, A.F.; da Silva Serpa, M. Challenges in high-performance computing. J. Braz. Comput. Soc. 2023, 29, 51–62. [Google Scholar] [CrossRef]
  128. Pyzer-Knapp, E.O.; Curioni, A. Advancing biomolecular simulation through exascale HPC, AI and quantum computing. Curr. Opin. Struct. Biol. 2024, 87, 102826. [Google Scholar] [CrossRef]
  129. Thakur, S.; Jha, S.K. Cloud Computing and its Emerging Trends on Big Data Analytics. In Proceedings of the 2023 4th International Conference on Electronics and Sustainable Communication Systems (ICESC), Coimbatore, India, 6–8 July 2023; pp. 1159–1164. [Google Scholar]
  130. Karunamurthy, A.; Yuvaraj, M.; Shahithya, J.; Thenmozhi, V. Cloud Database: Empowering Scalable and Flexible Data Management. Quing Int. J. Innov. Res. Sci. Eng. 2023, 2, 1–23. [Google Scholar] [CrossRef]
  131. Samha, A.K. Strategies for efficient resource management in federated cloud environments supporting Infrastructure as a Service (IaaS). J. Eng. Res. 2024, 12, 101–114. [Google Scholar] [CrossRef]
  132. Bharany, S.; Kaur, K.; Badotra, S.; Rani, S.; Kavita; Wozniak, M.; Shafi, J.; Ijaz, M.F. Efficient middleware for the portability of paas services consuming applications among heterogeneous clouds. Sensors 2022, 22, 5013. [Google Scholar] [CrossRef]
  133. Nadeem, F. Evaluating and ranking cloud IaaS, PaaS and SaaS models based on functional and non-functional key performance indicators. IEEE Access 2022, 10, 63245–63257. [Google Scholar] [CrossRef]
  134. Alhaidari, F.; Rahman, A.; Zagrouba, R. Cloud of Things: Architecture, applications and challenges. J. Ambient Intell. Humaniz. Comput. 2023, 14, 5957–5975. [Google Scholar] [CrossRef]
  135. Chi, H.R.; de Fátima Domingues, M.; Zhu, H.; Li, C.; Kojima, K.; Radwan, A. Healthcare 5.0: In the perspective of consumer internet-of-things-based fog/cloud computing. IEEE Trans. Consum. Electron. 2023, 69, 745–755. [Google Scholar] [CrossRef]
  136. Wang, Y.; Zhu, M.; Yuan, J.; Wang, G.; Zhou, H. The intelligent prediction and assessment of financial information risk in the cloud computing model. arXiv 2024, arXiv:2404.09322. [Google Scholar] [CrossRef]
  137. Hazra, A.; Rana, P.; Adhikari, M.; Amgoth, T. Fog computing for next-generation internet of things: Fundamental, state-of-the-art and research challenges. Comput. Sci. Rev. 2023, 48, 100549. [Google Scholar] [CrossRef]
  138. Kong, X.; Wu, Y.; Wang, H.; Xia, F. Edge computing for internet of everything: A survey. IEEE Internet Things J. 2022, 9, 23472–23485. [Google Scholar] [CrossRef]
  139. Mann, Z.A. Decentralized application placement in fog computing. IEEE Trans. Parallel Distrib. Syst. 2022, 33, 3262–3273. [Google Scholar] [CrossRef]
140. Oprea, S.V.; Bâra, A. An Edge-Fog-Cloud computing architecture for IoT and smart metering data. Peer-to-Peer Netw. Appl. 2023, 16, 818–845. [Google Scholar] [CrossRef]
  141. Hazarika, A.; Rahmati, M. Towards an evolved immersive experience: Exploring 5G-and beyond-enabled ultra-low-latency communications for augmented and virtual reality. Sensors 2023, 23, 3682. [Google Scholar] [CrossRef]
  142. Sefati, S.S.; Halunga, S. Ultra-reliability and low-latency communications on the internet of things based on 5G network: Literature review, classification, and future research view. Trans. Emerg. Telecommun. Technol. 2023, 34, e4770. [Google Scholar] [CrossRef]
  143. Qadir, Z.; Le, K.N.; Saeed, N.; Munawar, H.S. Towards 6G Internet of Things: Recent advances, use cases, and open challenges. ICT Express 2023, 9, 296–312. [Google Scholar] [CrossRef]
  144. Ajmal, M.; Siddiqa, A.; Jeong, B.; Seo, J.; Kim, D. Cell-free massive multiple-input multiple-output challenges and opportunities: A survey. ICT Express 2023, 10, 194–212. [Google Scholar] [CrossRef]
  145. Fang, G.; Sun, Y.; Almutiq, M.; Zhou, W.; Zhao, Y.; Ren, Y. Distributed medical data storage mechanism based on proof of retrievability and vector commitment for metaverse services. IEEE J. Biomed. Health Inform. 2023, 1–9. [Google Scholar] [CrossRef] [PubMed]
  146. Ooi, B.C.; Chen, G.; Shou, M.Z.; Tan, K.L.; Tung, A.; Xiao, X.; Yip, J.W.L.; Zhang, B.; Zhang, M. The metaverse data deluge: What can we do about it? In Proceedings of the 2023 IEEE 39th International Conference on Data Engineering (ICDE), Anaheim, CA, USA, 3–7 April 2023; pp. 3675–3687. [Google Scholar]
  147. Chougule, S.B.; Chaudhari, B.S.; Ghorpade, S.N.; Zennaro, M. Exploring Computing Paradigms for Electric Vehicles: From Cloud to Edge Intelligence, Challenges and Future Directions. World Electr. Veh. J. 2024, 15, 39. [Google Scholar] [CrossRef]
  148. Maher, S.M.; Ebrahim, G.A.; Hosny, S.; Salah, M.M. A cache-enabled device-to-device approach based on deep learning. IEEE Access 2023, 11, 76953–76963. [Google Scholar] [CrossRef]
  149. Masood, A.; Tuan, D.Q.; Lakew, D.S.; Dao, N.N.; Cho, S. A Review on AI-Enabled Content Caching in Vehicular Edge Caching and Networks. In Proceedings of the 2023 International Conference on Artificial Intelligence in Information and Communication (ICAIIC), Bali, Indonesia, 20–23 February 2023; pp. 713–717. [Google Scholar]
  150. Singh, R.; Gill, S.S. Edge AI: A survey. Internet Things Cyber-Phys. Syst. 2023, 3, 71–92. [Google Scholar] [CrossRef]
  151. Qiu, S.; Fan, Q.; Li, X.; Zhang, X.; Min, G.; Lyu, Y. OA-cache: Oracle approximation-based cache replacement at the network edge. IEEE Trans. Netw. Serv. Manag. 2023, 20, 3177–3189. [Google Scholar] [CrossRef]
  152. Sun, H.; Sun, C.; Tong, H.; Yue, Y.; Qin, X. A Machine Learning-empowered Cache Management Scheme for High-Performance SSDs. IEEE Trans. Comput. 2024, 73, 2066–2080. [Google Scholar] [CrossRef]
  153. Murala, D.K.; Panda, S.K. Metaverse: A study on immersive technologies. In Metaverse and Immersive Technologies: An Introduction to Industrial, Business and Social Applications; Wiley: Hoboken, NJ, USA, 2023; pp. 1–41. [Google Scholar]
  154. Bibri, S.E.; Jagatheesaperumal, S.K. Harnessing the potential of the metaverse and artificial intelligence for the internet of city things: Cost-effective XReality and synergistic AIoT technologies. Smart Cities 2023, 6, 2397–2429. [Google Scholar] [CrossRef]
  155. Familoni, B.T.; Onyebuchi, N.C. Augmented and virtual reality in us education: A review: Analyzing the impact, effectiveness, and future prospects of ar/vr tools in enhancing learning experiences. Int. J. Appl. Res. Soc. Sci. 2024, 6, 642–663. [Google Scholar] [CrossRef]
  156. Korkut, E.H.; Surer, E. Visualization in virtual reality: A systematic review. Virtual Real. 2023, 27, 1447–1480. [Google Scholar] [CrossRef]
  157. Yang, Y.; Zhong, L.; Li, S.; Yu, A. Research on the Perceived Quality of Virtual Reality Headsets in Human–Computer Interaction. Sensors 2023, 23, 6824. [Google Scholar] [CrossRef] [PubMed]
  158. Theodoropoulos, A.; Stavropoulou, D.; Papadopoulos, P.; Platis, N.; Lepouras, G. Developing an interactive VR CAVE for immersive shared gaming experiences. Virtual Worlds 2023, 2, 162–181. [Google Scholar] [CrossRef]
  159. Raji, M.A.; Olodo, H.B.; Oke, T.T.; Addy, W.A.; Ofodile, O.C.; Oyewole, A.T. Business strategies in virtual reality: A review of market opportunities and consumer experience. Int. J. Manag. Entrep. Res. 2024, 6, 722–736. [Google Scholar] [CrossRef]
  160. Miljkovic, I.; Shlyakhetko, O.; Fedushko, S. Real estate app development based on AI/VR technologies. Electronics 2023, 12, 707. [Google Scholar] [CrossRef]
  161. Creed, C.; Al-Kalbani, M.; Theil, A.; Sarcar, S.; Williams, I. Inclusive AR/VR: Accessibility barriers for immersive technologies. Univers. Access Inf. Soc. 2024, 23, 59–73. [Google Scholar] [CrossRef]
  162. Dargan, S.; Bansal, S.; Kumar, M.; Mittal, A.; Kumar, K. Augmented reality: A comprehensive review. Arch. Comput. Methods Eng. 2023, 30, 1057–1080. [Google Scholar] [CrossRef]
  163. Mendoza-Ramírez, C.E.; Tudon-Martinez, J.C.; Félix-Herrán, L.C.; Lozoya-Santos, J.d.J.; Vargas-Martínez, A. Augmented reality: Survey. Appl. Sci. 2023, 13, 10491. [Google Scholar] [CrossRef]
  164. Kim, D.; Chae, H.; Kim, Y.; Choi, J.; Kim, K.H.; Jo, D. Real-Time Motion Adaptation with Spatial Perception for an Augmented Reality Character. Appl. Sci. 2024, 14, 650. [Google Scholar] [CrossRef]
  165. Sayeedunnisa, S.F.; Saberi, K.H.; Mohiuddin, M.A. Augmented GPS Navigation: Enhancing the Reliability of Location-Based Services. In Proceedings of the 2023 International Conference on Advancement in Computation & Computer Technologies (InCACCT), Gharuan, India, 5–6 May 2023; pp. 565–569. [Google Scholar]
  166. Liu, D.; Huang, R.; Metwally, A.H.S.; Tlili, A.; Lin, E.F. Application of the Metaverse in Education; Springer: Berlin/Heidelberg, Germany, 2024. [Google Scholar]
  167. Vyas, S. Extended reality and edge AI for healthcare 4.0: Systematic study. In Extended Reality for Healthcare Systems; Elsevier: Amsterdam, The Netherlands, 2023; pp. 229–240. [Google Scholar]
  168. AL Hilal, N.S.H. The impact of the use of augmented reality on online purchasing behavior sustainability: The Saudi consumer as a model. Sustainability 2023, 15, 5448. [Google Scholar] [CrossRef]
  169. Tsolis, A.; Bakogianni, S.; Angelaki, C.; Alexandridis, A.A. A review of clothing components in the development of wearable textile antennas: Design and experimental procedure. Sensors 2023, 23, 3289. [Google Scholar] [CrossRef]
  170. Minopoulos, G.M.; Memos, V.A.; Stergiou, K.D.; Stergiou, C.L.; Psannis, K.E. A medical image visualization technique assisted with AI-based haptic feedback for robotic surgery and healthcare. Appl. Sci. 2023, 13, 3592. [Google Scholar] [CrossRef]
  171. Buhalis, D.; Karatay, N. Mixed reality (MR) for generation Z in cultural heritage tourism towards metaverse. In Information and Communication Technologies in Tourism 2022: Proceedings of the ENTER 2022 eTourism Conference, Tianjin, China, 11–14 January 2022; Springer: Berlin/Heidelberg, Germany, 2022; pp. 16–27. [Google Scholar]
  172. Nleya, S.M.; Velempini, M. Industrial Metaverse: A Comprehensive Review, Environmental Impact, and Challenges. Appl. Sci. 2024, 14, 5736. [Google Scholar] [CrossRef]
  173. El Koshiry, A.; Eliwa, E.; Abd El-Hafeez, T.; Shams, M.Y. Unlocking the power of blockchain in education: An overview of innovations and outcomes. Blockchain Res. Appl. 2023, 4, 100165. [Google Scholar] [CrossRef]
  174. Mitra, S. Metaverse: A potential virtual-physical ecosystem for innovative blended education and training. J. Metaverse 2023, 3, 66–72. [Google Scholar] [CrossRef]
175. Tan, Y.W.; Low, S.E.; Chow, J.; Teo, J.; Bhojan, A. DHR+S: Distributed hybrid rendering with realistic real-time shadows for interactive thin client metaverse and game applications. Vis. Comput. 2024, 40, 4981–4991. [Google Scholar] [CrossRef]
  176. Wang, J.; Chen, S.; Liu, Y.; Lau, R. Intelligent metaverse scene content construction. IEEE Access 2023, 11, 76222–76241. [Google Scholar] [CrossRef]
  177. Xu, M.; Niyato, D.; Wright, B.; Zhang, H.; Kang, J.; Xiong, Z.; Mao, S.; Han, Z. EPViSA: Efficient Auction Design for Real-Time Physical-Virtual Synchronization in the Human-Centric Metaverse. IEEE J. Sel. Areas Commun. 2023, 42, 694–709. [Google Scholar] [CrossRef]
  178. Wang, Y.; Su, Z.; Guo, S.; Dai, M.; Luan, T.H.; Liu, Y. A survey on digital twins: Architecture, enabling technologies, security and privacy, and future prospects. IEEE Internet Things J. 2023, 10, 14965–14987. [Google Scholar] [CrossRef]
  179. Ali, W.A.; Fanti, M.P.; Roccotelli, M.; Ranieri, L. A review of digital twin technology for electric and autonomous vehicles. Appl. Sci. 2023, 13, 5871. [Google Scholar] [CrossRef]
  180. Meijer, C.; Uh, H.W.; El Bouhaddani, S. Digital twins in healthcare: Methodological challenges and opportunities. J. Pers. Med. 2023, 13, 1522. [Google Scholar] [CrossRef]
  181. Kuru, K. Metaomnicity: Toward immersive urban metaverse cyberspaces using smart city digital twins. IEEE Access 2023, 11, 43844–43868. [Google Scholar] [CrossRef]
  182. Faliagka, E.; Christopoulou, E.; Ringas, D.; Politi, T.; Kostis, N.; Leonardos, D.; Tranoris, C.; Antonopoulos, C.P.; Denazis, S.; Voros, N. Trends in digital twin framework architectures for smart cities: A case study in smart mobility. Sensors 2024, 24, 1665. [Google Scholar] [CrossRef] [PubMed]
  183. Turab, M.; Jamil, S. A comprehensive survey of digital twins in healthcare in the era of metaverse. BioMedInformatics 2023, 3, 563–584. [Google Scholar] [CrossRef]
  184. Soori, M.; Arezoo, B.; Dastres, R. Digital twin for smart manufacturing, A review. Sustain. Manuf. Serv. Econ. 2023, 2, 100017. [Google Scholar] [CrossRef]
  185. Aggarwal, S.; Kumar, N.; Singh, A.; Aujla, G.S. BlockTwins: Blockchain Empowered Supply Chain Digital Twins in Metaverse. In Proceedings of the 2024 IEEE International Conference on Communications Workshops (ICC Workshops), Denver, CO, USA, 9–13 June 2024; pp. 1456–1461. [Google Scholar]
  186. Qu, Q.; Ogunbunmi, S.; Hatami, M.; Xu, R.; Chen, Y.; Chen, G.; Blasch, E. A digital twins enabled reputation system for microchain-based uav networks. In Proceedings of the 2023 IEEE 12th International Conference on Cloud Networking (CloudNet), Hoboken, NJ, USA, 1–3 November 2023; pp. 428–432. [Google Scholar]
  187. Sumon, R.I.; Uddin, S.M.I.; Akter, S.; Mozumder, M.A.I.; Khan, M.O.; Kim, H.C. Natural Language Processing Influence on Digital Socialization and Linguistic Interactions in the Integration of the Metaverse in Regular Social Life. Electronics 2024, 13, 1331. [Google Scholar] [CrossRef]
  188. Li, B.; Xu, T.; Li, X.; Cui, Y.; Bian, X.; Teng, S.; Ma, S.; Fan, L.; Tian, Y.; Wang, F.Y. Integrating large language models and metaverse in autonomous racing: An education-oriented perspective. IEEE Trans. Intell. Veh. 2024, 9, 59–64. [Google Scholar] [CrossRef]
  189. Far, S.B.; Rad, A.I.; Bamakan, S.M.H.; Asaar, M.R. Toward Metaverse of everything: Opportunities, challenges, and future directions of the next generation of visual/virtual communications. J. Netw. Comput. Appl. 2023, 217, 103675. [Google Scholar]
  190. Sun, Y.; Wang, H.; Chan, P.M.; Tabibi, M.; Zhang, Y.; Lu, H.; Chen, Y.; Lee, C.H.; Asadipour, A. Fictional Worlds, Real Connections: Developing Community Storytelling Social Chatbots through LLMs. arXiv 2023, arXiv:2309.11478. [Google Scholar]
  191. Partarakis, N.; Zabulis, X. A review of immersive technologies, knowledge representation, and AI for human-centered digital experiences. Electronics 2024, 13, 269. [Google Scholar] [CrossRef]
  192. Hassan, S.Z.; Salehi, P.; Røed, R.K.; Halvorsen, P.; Baugerud, G.A.; Johnson, M.S.; Lison, P.; Riegler, M.; Lamb, M.E.; Griwodz, C.; et al. Towards an AI-driven talking avatar in virtual reality for investigative interviews of children. In Proceedings of the 2nd Workshop on Games Systems, Athlone, Ireland, 14 June 2022; pp. 9–15. [Google Scholar]
  193. Li, Y.; Hashim, A.S.; Lin, Y.; Nohuddin, P.N.; Venkatachalam, K.; Ahmadian, A. AI-based Visual Speech Recognition Towards Realistic Avatars and Lip-Reading Applications in the Metaverse. Appl. Soft Comput. 2024, 164, 111906. [Google Scholar] [CrossRef]
  194. Hu, Y.H.; Yu, H.Y.; Tzeng, J.W.; Zhong, K.C. Using an avatar-based digital collaboration platform to foster ethical education for university students. Comput. Educ. 2023, 196, 104728. [Google Scholar] [CrossRef]
  195. Jones, C.L.E.; Hancock, T.; Kazandjian, B.; Voorhees, C.M. Engaging the Avatar: The effects of authenticity signals during chat-based service recoveries. J. Bus. Res. 2022, 144, 703–716. [Google Scholar] [CrossRef]
  196. Morrow, E.; Zidaru, T.; Ross, F.; Mason, C.; Patel, K.D.; Ream, M.; Stockley, R. Artificial intelligence technologies and compassion in healthcare: A systematic scoping review. Front. Psychol. 2023, 13, 971044. [Google Scholar] [CrossRef]
  197. Campitiello, L.; Beatini, V.; Di Tore, S. Non-player Character Smart in Virtual Learning Environment: Empowering Education Through Artificial Intelligence. In Artificial Intelligence with and for Learning Sciences. Past, Present, and Future Horizons, Proceedings of the First Workshop, WAILS 2024, Salerno, Italy, 18–19 January 2024; Springer: Berlin/Heidelberg, Germany, 2024; pp. 131–137. [Google Scholar]
  198. Chodvadiya, C.; Solanki, V.; Singh, K.G. Intelligent Virtual Worlds: A Survey of the Role of AI in the Metaverse. In Proceedings of the 2024 3rd International Conference for Innovation in Technology (INOCON), Bangalore, India, 1–3 March 2024; pp. 1–6. [Google Scholar]
  199. Selznick, B.S.; Goodman, M.A.; McCready, A.M.; Duran, A. Developing Relational Leaders Through Sorority Engagement: A Quantitative Approach. Innov. High. Educ. 2024, 49, 319–347. [Google Scholar] [CrossRef]
  200. Huyen, N.T. Fostering Design Thinking mindset for university students with NPCs in the metaverse. Heliyon 2024, 10, e34964. [Google Scholar] [CrossRef]
201. Qin, H.X.; Hui, P. Empowering the metaverse with generative AI: Survey and future directions. In Proceedings of the 2023 IEEE 43rd International Conference on Distributed Computing Systems Workshops (ICDCSW), Hong Kong, China, 18–21 July 2023; pp. 85–90. [Google Scholar]
  202. Makowska, A.; Weiskirchen, R. Nasopharyngeal Carcinoma Cell Lines: Reliable Alternatives to Primary Nasopharyngeal Cells? Cells 2024, 13, 559. [Google Scholar] [CrossRef]
203. Pretty, E.J.; Fayek, H.M.; Zambetta, F. A case for personalized non-player character companion design. Int. J. Hum.-Comput. Interact. 2024, 40, 3051–3070. [Google Scholar] [CrossRef]
  204. Uludağlı, M.Ç.; Oğuz, K. Non-player character decision-making in computer games. Artif. Intell. Rev. 2023, 56, 14159–14191. [Google Scholar] [CrossRef]
  205. Mourtzis, D.; Angelopoulos, J.; Panopoulos, N. Blockchain integration in the era of industrial metaverse. Appl. Sci. 2023, 13, 1353. [Google Scholar] [CrossRef]
  206. Cheng, Y.; Guo, Y.; Xu, M.; Hu, Q.; Yu, D.; Cheng, X. An adaptive and modular blockchain enabled architecture for a decentralized metaverse. IEEE J. Sel. Areas Commun. 2023, 42, 893–904. [Google Scholar] [CrossRef]
207. Ogunbunmi, S.; Hatami, M.; Xu, R.; Chen, Y.; Blasch, E.; Ardiles-Cruz, E.; Aved, A.; Chen, G. A lightweight reputation system for UAV networks. In Security and Privacy in Cyber-Physical Systems and Smart Vehicles, Proceedings of the First EAI International Conference, SmartSP 2023, Chicago, IL, USA, 12–13 October 2023; Springer: Berlin/Heidelberg, Germany, 2023; pp. 114–129. [Google Scholar]
  208. Kumar, P.; Kumar, R.; Aloqaily, M.; Islam, A.N. Explainable AI and blockchain for metaverse: A security, and privacy perspective. IEEE Consum. Electron. Mag. 2023, 13, 90–97. [Google Scholar] [CrossRef]
  209. Kottursamy, K.; Sadayapillai, B.; AlZubi, A.A.; Bashir, A.K. A novel blockchain architecture with mutable block and immutable transactions for enhanced scalability. Sustain. Energy Technol. Assess. 2023, 58, 103320. [Google Scholar] [CrossRef]
  210. Torab-Miandoab, A.; Samad-Soltani, T.; Jodati, A.; Rezaei-Hachesu, P. Interoperability of heterogeneous health information systems: A systematic literature review. BMC Med. Inform. Decis. Mak. 2023, 23, 18. [Google Scholar] [CrossRef]
  211. Bianchi, M.; Bouvard, M.; Gomes, R.; Rhodes, A.; Shreeti, V. Mobile payments and interoperability: Insights from the academic literature. Inf. Econ. Policy 2023, 65, 101068. [Google Scholar] [CrossRef]
  212. Narang, N.K. Mentor’s Musings on Role of Standards, Regulations & Policies in Navigating through Metaverse and its Future Avatars. IEEE Internet Things Mag. 2023, 6, 4–11. [Google Scholar]
  213. Yu, G.; Liu, C.; Fang, T.; Jia, J.; Lin, E.; He, Y.; Fu, S.; Wang, L.; Wei, L.; Huang, Q. A survey of real-time rendering on Web3D application. Virtual Real. Intell. Hardw. 2023, 5, 379–394. [Google Scholar] [CrossRef]
  214. Hamidouche, W.; Bariah, L.; Debbah, M. Immersive Media and Massive Twinning: Advancing Toward the Metaverse. IEEE Commun. Mag. 2024, 62, 20–32. [Google Scholar] [CrossRef]
  215. Della-Bosca, D.; Grant, G.; Patterson, D.; Roberts, S. The Multiplicitous Metaverse: Purposeful Ways of Applying and Understanding eXtended Reality in Learning and Teaching Frameworks. In Augmented and Virtual Reality in the Metaverse; Springer: Berlin/Heidelberg, Germany, 2024; pp. 41–63. [Google Scholar]
  216. Westphal, C.; Hong, J.; Kang, S.G.; Chiariglione, L.; Jiang, T. Networking for the Metaverse: The Standardization Landscape. arXiv 2023, arXiv:2312.09295. [Google Scholar] [CrossRef]
  217. The World Metaverse Council. 2024. Available online: https://wmetac.com/ (accessed on 15 October 2024).
  218. Tran, K.Q.; Neeli, H.; Tsekos, N.V.; Velazco-Garcia, J.D. Immersion into 3D Biomedical Data via Holographic AR Interfaces Based on the Universal Scene Description (USD) Standard. In Proceedings of the 2023 IEEE 23rd International Conference on Bioinformatics and Bioengineering (BIBE), Virtual, 4–6 December 2023; pp. 354–358. [Google Scholar]
  219. Li, T.; Yang, C.; Yang, Q.; Lan, S.; Zhou, S.; Luo, X.; Huang, H.; Zheng, Z. Metaopera: A cross-metaverse interoperability protocol. IEEE Wirel. Commun. 2023, 30, 136–143. [Google Scholar] [CrossRef]
  220. Yoo, K.; Welden, R.; Hewett, K.; Haenlein, M. The merchants of meta: A research agenda to understand the future of retailing in the metaverse. J. Retail. 2023, 99, 173–192. [Google Scholar] [CrossRef]
  221. Liarokapis, F.; Milata, V.; Ponton, J.L.; Pelechano, N.; Zacharatos, H. XR4ED: An Extended Reality Platform for Education. IEEE Comput. Graph. Appl. 2024, 44, 79–88. [Google Scholar] [CrossRef] [PubMed]
  222. Quek, H.Y.; Sielker, F.; Akroyd, J.; Bhave, A.N.; von Richthofen, A.; Herthogs, P.; van der Laag Yamu, C.; Wan, L.; Nochta, T.; Burgess, G.; et al. The conundrum in smart city governance: Interoperability and compatibility in an ever-growing ecosystem of digital twins. Data Policy 2023, 5, e6. [Google Scholar] [CrossRef]
  223. Ball, M. The Metaverse: And How It Will Revolutionize Everything; Liveright Publishing: New York, NY, USA, 2022. [Google Scholar]
  224. Song, Y.T.; Qin, J. Metaverse and personal healthcare. Procedia Comput. Sci. 2022, 210, 189–197. [Google Scholar] [CrossRef]
  225. Albouq, S.S.; Abi Sen, A.A.; Almashf, N.; Yamin, M.; Alshanqiti, A.; Bahbouh, N.M. A survey of interoperability challenges and solutions for dealing with them in IoT environment. IEEE Access 2022, 10, 36416–36428. [Google Scholar] [CrossRef]
  226. Perri, D.; Simonetti, M.; Tasso, S.; Gervasi, O. Open metaverse with open software. In Computational Science and Its Applications—ICCSA 2023, Proceedings of the 23rd International Conference, Athens, Greece, 3–6 July 2023; Springer: Berlin/Heidelberg, Germany, 2023; pp. 583–596. [Google Scholar]
  227. Bühler, M.M.; Calzada, I.; Cane, I.; Jelinek, T.; Kapoor, A.; Mannan, M.; Mehta, S.; Mookerje, V.; Nübel, K.; Pentland, A.; et al. Unlocking the power of digital commons: Data cooperatives as a pathway for data sovereign, innovative and equitable digital communities. Digital 2023, 3, 146–171. [Google Scholar] [CrossRef]
  228. Huzaifa, M.; Desai, R.; Grayson, S.; Jiang, X.; Jing, Y.; Lee, J.; Lu, F.; Pang, Y.; Ravichandran, J.; Sinclair, F.; et al. Illixr: An open testbed to enable extended reality systems research. IEEE Micro 2022, 42, 97–106. [Google Scholar] [CrossRef]
  229. Javerliat, C.; Villenave, S.; Raimbaud, P.; Lavoué, G. Plume: Record, replay, analyze and share user behavior in 6dof xr experiences. IEEE Trans. Vis. Comput. Graph. 2024, 30, 2087–2097. [Google Scholar] [CrossRef]
  230. Memmesheimer, V.; Ebert, A. A human-centered framework for scalable extended reality spaces. In Proceedings of the International Research Training Group Conference on Physical Modeling for Virtual Manufacturing Systems and Processes; Springer: Berlin/Heidelberg, Germany, 2023; pp. 111–128. [Google Scholar]
  231. Abeywardena, I.S. OXREF: Open XR for Education Framework. Int. Rev. Res. Open Distrib. Learn. 2023, 24, 185–206. [Google Scholar] [CrossRef]
  232. Sai, S.; Goyal, D.; Chamola, V.; Sikdar, B. Consumer electronics technologies for enabling an immersive metaverse experience. IEEE Consum. Electron. Mag. 2023, 13, 16–24. [Google Scholar] [CrossRef]
  233. Rubinfeld, D. Data portability and interoperability: An EU-US comparison. Eur. J. Law Econ. 2024, 57, 163–179. [Google Scholar] [CrossRef]
  234. De Luca, V.; Gatto, C.; Liaci, S.; Corchia, L.; Chiarello, S.; Faggiano, F.; Sumerano, G.; De Paolis, L.T. Virtual reality and spatial augmented reality for social inclusion: The “Includiamoci” project. Information 2023, 14, 38. [Google Scholar] [CrossRef]
  235. Martí-Testón, A.; Muñoz, A.; Gracia, L.; Solanes, J.E. Using WebXR metaverse platforms to create touristic services and cultural promotion. Appl. Sci. 2023, 13, 8544. [Google Scholar] [CrossRef]
  236. Boutsi, A.M.; Ioannidis, C.; Verykokou, S. Multi-Resolution 3D Rendering for High-Performance Web AR. Sensors 2023, 23, 6885. [Google Scholar] [CrossRef] [PubMed]
  237. Sukaridhoto, S.; Hanifati, K.; Fajrianti, E.D.; Haz, A.L.; Hafidz, I.A.A.; Basuki, D.K.; Budiarti, R.P.N.; Wicaksono, H. Web-Based Extended Reality for Supporting Medical Education. In Proceedings of the SAI Intelligent Systems Conference, Amsterdam, The Netherlands, 7–8 September 2023; Springer: Berlin/Heidelberg, Germany, 2023; pp. 791–805. [Google Scholar]
  238. Greenway, K.; Frisone, C.; Placidi, A.; Kumar, S.; Guest, W.; Winter, S.C.; Shah, K.; Henshall, C. Using immersive technology and architectural design to assist head and neck cancer patients’ recovery from treatment: A focus group and technology acceptance study. Eur. J. Oncol. Nurs. 2023, 62, 102261. [Google Scholar] [CrossRef] [PubMed]
  239. Kaarlela, T.; Pitkäaho, T.; Pieskä, S.; Padrão, P.; Bobadilla, L.; Tikanmäki, M.; Haavisto, T.; Blanco Bataller, V.; Laivuori, N.; Luimula, M. Towards metaverse: Utilizing extended reality and digital twins to control robotic systems. Actuators 2023, 12, 219. [Google Scholar] [CrossRef]
  240. Cheng, R.; Wu, N.; Varvello, M.; Chen, S.; Han, B. Are we ready for metaverse? A measurement study of social virtual reality platforms. In Proceedings of the 22nd ACM Internet Measurement Conference, Nice, France, 25–27 October 2022; pp. 504–518. [Google Scholar]
  241. Warburton, M.; Mon-Williams, M.; Mushtaq, F.; Morehead, J.R. Measuring motion-to-photon latency for sensorimotor experiments with virtual reality systems. Behav. Res. Methods 2023, 55, 3658–3678. [Google Scholar] [CrossRef]
  242. Zhong, L.; Chen, X.; Xu, C.; Ma, Y.; Wang, M.; Zhao, Y.; Muntean, G.M. A multi-user cost-efficient crowd-assisted VR content delivery solution in 5G-and-beyond heterogeneous networks. IEEE Trans. Mob. Comput. 2022, 22, 4405–4421. [Google Scholar] [CrossRef]
  243. Krauss, D. Mean-Time-to-Cloud: Enabling a Metaverse-Ready Network. 2023. Available online: https://www.ciena.com/insights/articles/2023/mean-time-to-cloud-enabling-a-metaverse-ready-network (accessed on 15 October 2024).
  244. Liu, S.; Xu, X.; Claypool, M. A survey and taxonomy of latency compensation techniques for network computer games. ACM Comput. Surv. (CSUR) 2022, 54, 1–34. [Google Scholar] [CrossRef]
  245. Yu, H.; Shokrnezhad, M.; Taleb, T.; Li, R.; Song, J. Toward 6g-based metaverse: Supporting highly-dynamic deterministic multi-user extended reality services. IEEE Netw. 2023, 37, 30–38. [Google Scholar] [CrossRef]
  246. Chu, N.H.; Hoang, D.T.; Nguyen, D.N.; Phan, K.T.; Dutkiewicz, E.; Niyato, D.; Shu, T. Metaslicing: A novel resource allocation framework for metaverse. IEEE Trans. Mob. Comput. 2023, 23, 4145–4162. [Google Scholar] [CrossRef]
  247. Zalan, T.; Barbesino, P. Making the metaverse real. Digit. Bus. 2023, 3, 100059. [Google Scholar] [CrossRef]
  248. Sihare, S.; Khang, A. Effects of quantum technology on the metaverse. In Handbook of Research on AI-Based Technologies and Applications in the Era of the Metaverse; IGI Global: Hershey, PA, USA, 2023; pp. 174–203. [Google Scholar]
  249. Mozaffariahrar, E.; Theoleyre, F.; Menth, M. A survey of Wi-Fi 6: Technologies, advances, and challenges. Future Internet 2022, 14, 293. [Google Scholar] [CrossRef]
  250. Chen, C.; Chen, X.; Das, D.; Akhmetov, D.; Cordeiro, C. Overview and performance evaluation of Wi-Fi 7. IEEE Commun. Stand. Mag. 2022, 6, 12–18. [Google Scholar] [CrossRef]
  251. Reshef, E.; Cordeiro, C. Future directions for Wi-Fi 8 and beyond. IEEE Commun. Mag. 2022, 60, 50–55. [Google Scholar] [CrossRef]
  252. Liu, X.; Chen, T.; Dong, Y.; Mao, Z.; Gan, M.; Yang, X.; Lu, J. Wi-Fi 8: Embracing the Millimeter-Wave Era. arXiv 2023, arXiv:2309.16813. [Google Scholar]
  253. Domeke, A.; Cimoli, B.; Monroy, I.T. Integration of network slicing and machine learning into edge networks for low-latency services in 5G and beyond systems. Appl. Sci. 2022, 12, 6617. [Google Scholar] [CrossRef]
  254. Dommeti, V.S.; Dharani, M.; Shasidhar, K.; Reddy, Y.D.R.; Moorthy, T.V. Quality Enhancement with Frame-wise DLCNN using High Efficiency Video Coding in 5G Networks. Scalable Comput. Pract. Exp. 2024, 25, 1264–1275. [Google Scholar] [CrossRef]
  255. Shin, H.; Lee, J. Hardware Multi-Threaded System for High-Performance JPEG Decoding. J. Signal Process. Syst. 2024, 96, 67–79. [Google Scholar] [CrossRef]
  256. Zhou, Y.; Chen, W.; Li, L.; Li, M.; Zhang, T. Initiative, Immersive Human-Computer Interactions: Software Defined Memristive Neural Networks for Non-Linear Heterogeneous Scheme in the Internet of Brains. IEEE Internet Things J. 2023, 11, 13355–13371. [Google Scholar] [CrossRef]
  257. Song, R.; Xiao, B.; Song, Y.; Guo, S.; Yang, Y. A survey of blockchain-based schemes for data sharing and exchange. IEEE Trans. Big Data 2023, 9, 1477–1495. [Google Scholar] [CrossRef]
  258. Zhao, C.; Zhang, S.; Wang, T.; Liew, S.C. Bodyless block propagation: Tps fully scalable blockchain with pre-validation. Future Gener. Comput. Syst. 2024, 163, 107516. [Google Scholar] [CrossRef]
  259. Monem, M.; Hossain, M.T.; Alam, M.G.R.; Munir, M.S.; Rahman, M.M.; AlQahtani, S.A.; Almutlaq, S.; Hassan, M.M. A sustainable Bitcoin blockchain network through introducing dynamic block size adjustment using predictive analytics. Future Gener. Comput. Syst. 2024, 153, 12–26. [Google Scholar] [CrossRef]
  260. Zafar, S.; Bhatti, K.; Shabbir, M.; Hashmat, F.; Akbar, A.H. Integration of blockchain and Internet of Things: Challenges and solutions. Ann. Telecommun. 2022, 77, 13–32. [Google Scholar] [CrossRef]
  261. Kushwaha, S.S.; Joshi, S.; Singh, D.; Kaur, M.; Lee, H.N. Systematic review of security vulnerabilities in ethereum blockchain smart contract. IEEE Access 2022, 10, 6605–6621. [Google Scholar] [CrossRef]
  262. Lasla, N.; Al-Sahan, L.; Abdallah, M.; Younis, M. Green-PoW: An energy-efficient blockchain Proof-of-Work consensus algorithm. Comput. Netw. 2022, 214, 109118. [Google Scholar] [CrossRef]
  263. Ren, K.; Ho, N.M.; Loghin, D.; Nguyen, T.T.; Ooi, B.C.; Ta, Q.T.; Zhu, F. Interoperability in blockchain: A survey. IEEE Trans. Knowl. Data Eng. 2023, 35, 12750–12769. [Google Scholar] [CrossRef]
  264. Rafique, W.; Qadir, J. Internet of everything meets the metaverse: Bridging physical and virtual worlds with blockchain. Comput. Sci. Rev. 2024, 54, 100678. [Google Scholar] [CrossRef]
  265. Ghirmai, S.; Mebrahtom, D.; Aloqaily, M.; Guizani, M.; Debbah, M. Self-sovereign identity for trust and interoperability in the metaverse. In Proceedings of the 2022 IEEE Smartworld, Ubiquitous Intelligence & Computing, Scalable Computing & Communications, Digital Twin, Privacy Computing, Metaverse, Autonomous & Trusted Vehicles (SmartWorld/UIC/ScalCom/DigitalTwin/PriComp/Meta), Haikou, China, 15–18 December 2022; pp. 2468–2475. [Google Scholar]
  266. Aung, N.; Dhelim, S.; Chen, L.; Ning, H.; Atzori, L.; Kechadi, T. Edge-enabled metaverse: The convergence of metaverse and mobile edge computing. Tsinghua Sci. Technol. 2023, 29, 795–805. [Google Scholar] [CrossRef]
  267. Abilkaiyrkyzy, A.; Elhagry, A.; Laamarti, F.; Elsaddik, A. Metaverse key requirements and platforms survey. IEEE Access 2023, 11, 117765–117787. [Google Scholar] [CrossRef]
  268. Rawat, D.B.; El Alami, H.; Hagos, D.H. Metaverse Survey & Tutorial: Exploring Key Requirements, Technologies, Standards, Applications, Challenges, and Perspectives. arXiv 2024, arXiv:2405.04718. [Google Scholar]
  269. Chen, B.; Song, C.; Lin, B.; Xu, X.; Tang, R.; Lin, Y.; Yao, Y.; Timoney, J.; Bi, T. A cross-platform metaverse data management system. In Proceedings of the 2022 IEEE International Conference on Metrology for Extended Reality, Artificial Intelligence and Neural Engineering (MetroXRAINE), Rome, Italy, 26–28 October 2022; pp. 145–150. [Google Scholar]
  270. Jaimini, U.; Zhang, T.; Brikis, G.O.; Sheth, A. imetaversekg: Industrial metaverse knowledge graph to promote interoperability in design and engineering applications. IEEE Internet Comput. 2022, 26, 59–67. [Google Scholar] [CrossRef]
  271. Zhang, Y.; Kutscher, D.; Cui, Y. Networked Metaverse Systems: Foundations, Gaps, Research Directions. IEEE Open J. Commun. Soc. 2024, 5, 5488–5539. [Google Scholar] [CrossRef]
  272. Altundas, S.; Karaarslan, E. Cross-platform and personalized avatars in the metaverse: Ready player me case. In Digital Twin Driven Intelligent Systems and Emerging Metaverse; Springer: Berlin/Heidelberg, Germany, 2023; pp. 317–330. [Google Scholar]
  273. Korbel, J.J.; Siddiq, U.H.; Zarnekow, R. Towards virtual 3D asset price prediction based on machine learning. J. Theor. Appl. Electron. Commer. Res. 2022, 17, 924–948. [Google Scholar] [CrossRef]
  274. Golam, M.; Tuli, E.A.; Alief, R.N.; Kim, D.S.; Lee, J.M. Meta-Learning: A Digital Learning Management Framework using Blockchain for Metaverses. IEEE Access 2024, 12, 92774–92786. [Google Scholar] [CrossRef]
  275. Lin, Y.; Liu, S. The integration strategy of information system based on artificial intelligence big data technology in metaverse environment. Clust. Comput. 2024, 27, 7049–7057. [Google Scholar] [CrossRef]
  276. Deshmukh, R.; Nand, N.; Pawar, A.; Wagh, D.; Kudale, A. Video conferencing using WebRTC. In Proceedings of the 2023 International Conference on Sustainable Computing and Data Communication Systems (ICSCDS), Erode, India, 23–25 March 2023; pp. 857–864. [Google Scholar]
  277. Praschl, C.; Krauss, O. Extending 3D geometric file formats for geospatial applications. Appl. Geomat. 2024, 16, 161–180. [Google Scholar] [CrossRef]
  278. Al-Jundi, H.A.; Tanbour, E.Y. A framework for fidelity evaluation of immersive virtual reality systems. Virtual Real. 2022, 26, 1103–1122. [Google Scholar] [CrossRef]
  279. Braud, T.; Fernández, C.B.; Hui, P. Scaling-up ar: University campus as a physical-digital metaverse. In Proceedings of the 2022 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW), Christchurch, New Zealand, 12–16 March 2022; pp. 169–175. [Google Scholar]
  280. Allam, Z.; Sharifi, A.; Bibri, S.E.; Jones, D.S.; Krogstie, J. The metaverse as a virtual form of smart cities: Opportunities and challenges for environmental, economic, and social sustainability in urban futures. Smart Cities 2022, 5, 771–801. [Google Scholar] [CrossRef]
  281. Cai, Y.; Llorca, J.; Tulino, A.M.; Molisch, A.F. Compute-and data-intensive networks: The key to the metaverse. In Proceedings of the 2022 1st International Conference on 6G Networking (6GNet), Paris, France, 6–8 July 2022; pp. 1–8. [Google Scholar]
  282. Paul, S.G.; Saha, A.; Arefin, M.S.; Bhuiyan, T.; Biswas, A.A.; Reza, A.W.; Alotaibi, N.M.; Alyami, S.A.; Moni, M.A. A comprehensive review of green computing: Past, present, and future research. IEEE Access 2023, 11, 87445–87494. [Google Scholar] [CrossRef]
  283. Yang, S. Storytelling and user experience in the cultural metaverse. Heliyon 2023, 9, e14759. [Google Scholar] [CrossRef]
  284. Omar, K.; Fakhouri, H.; Zraqou, J.; Marx Gómez, J. Usability Heuristics for Metaverse. Computers 2024, 13, 222. [Google Scholar] [CrossRef]
  285. Winter, M.; Jackson, P.; Fallahkhair, S. Gesture Me: A Machine Learning Tool for Designers to Train Gesture Classifiers. In Proceedings of the International Conference on Computer-Human Interaction Research and Applications, Rome, Italy, 16–17 November 2023; pp. 336–352. [Google Scholar]
  286. Chamusca, I.L.; Ferreira, C.V.; Murari, T.B.; Apolinario, A.L., Jr.; Winkler, I. Towards sustainable virtual reality: Gathering design guidelines for intuitive authoring tools. Sustainability 2023, 15, 2924. [Google Scholar] [CrossRef]
  287. Ramaseri Chandra, A.N.; El Jamiy, F.; Reza, H. A systematic survey on cybersickness in virtual environments. Computers 2022, 11, 51. [Google Scholar] [CrossRef]
  288. Christopoulos, A.; Mystakidis, S.; Pellas, N.; Laakso, M.J. ARLEAN: An augmented reality learning analytics ethical framework. Computers 2021, 10, 92. [Google Scholar] [CrossRef]
  289. Dwivedi, Y.K.; Kshetri, N.; Hughes, L.; Rana, N.P.; Baabdullah, A.M.; Kar, A.K.; Koohang, A.; Ribeiro-Navarrete, S.; Belei, N.; Balakrishnan, J.; et al. Exploring the darkverse: A multi-perspective analysis of the negative societal impacts of the metaverse. Inf. Syst. Front. 2023, 25, 2071–2114. [Google Scholar] [CrossRef]
  290. Slater, M.; Gonzalez-Liencres, C.; Haggard, P.; Vinkers, C.; Gregory-Clarke, R.; Jelley, S.; Watson, Z.; Breen, G.; Schwarz, R.; Steptoe, W.; et al. The ethics of realism in virtual and augmented reality. Front. Virtual Real. 2020, 1, 512449. [Google Scholar] [CrossRef]
  291. Lu, Z.; Yu, M.; Jiang, G.; Chi, B.; Dong, Q. Prediction of motion sickness degree of stereoscopic panoramic videos based on content perception and binocular characteristics. Digit. Signal Process. 2023, 132, 103787. [Google Scholar] [CrossRef]
  292. Guo, C.; Blair, G.J.; Sehgal, M.; Sangiuliano Jimka, F.N.; Bellafard, A.; Silva, A.J.; Golshani, P.; Basso, M.A.; Blair, H.T.; Aharoni, D. Miniscope-LFOV: A large-field-of-view, single-cell-resolution, miniature microscope for wired and wire-free imaging of neural dynamics in freely behaving animals. Sci. Adv. 2023, 9, eadg3918. [Google Scholar] [CrossRef]
  293. Kim, A. Exploring the Relationship among Cybersickness, Locomotion Method, and Heart Rate Variability when Navigating a Virtual Environment. In Proceedings of the 2024 IEEE International Conference on Artificial Intelligence and eXtended and Virtual Reality (AIxVR), Los Angeles, CA, USA, 17–19 January 2024; pp. 215–220. [Google Scholar]
  294. Radanliev, P.; De Roure, D.; Novitzky, P.; Sluganovic, I. Accessibility and inclusiveness of new information and communication technologies for disabled users and content creators in the Metaverse. Disabil. Rehabil. Assist. Technol. 2023, 19, 1849–1863. [Google Scholar] [CrossRef]
  295. Henni, S.H.; Maurud, S.; Fuglerud, K.S.; Moen, A. The experiences, needs and barriers of people with impairments related to usability and accessibility of digital health solutions, levels of involvement in the design process and strategies for participatory and universal design: A scoping review. BMC Public Health 2022, 22, 35. [Google Scholar] [CrossRef]
  296. Inbavalli, A.; Sakthidhasan, K.; Krishna, G. Image Generation Using AI with Effective Audio Playback System. In Proceedings of the 2024 5th International Conference for Emerging Technology (INCET), Belgaum, India, 24–26 May 2024; pp. 1–13. [Google Scholar]
  297. Pari, S.N.; Ahamed, M.J.S.; Magarika, M.; Latchiyanathan, K. SLatAR-A Sign Language Translating Augmented Reality Application. In Proceedings of the 2023 12th International Conference on Advanced Computing (ICoAC), Chennai, India, 17–19 August 2023; pp. 1–8. [Google Scholar]
  298. Jones, D.; Ghasemi, S.; Gračanin, D.; Azab, M. Privacy, safety, and security in extended reality: User experience challenges for neurodiverse users. In Proceedings of the International Conference on Human-Computer Interaction, Copenhagen, Denmark, 23–28 July 2023; pp. 511–528. [Google Scholar]
  299. Cruz, M.; Oliveira, A. Unravelling Virtual Realities—Gamers’ Perceptions of the Metaverse. Electronics 2024, 13, 2491. [Google Scholar] [CrossRef]
  300. Khan, R.H.; Miah, J.; Syeed, M.M.; Uddin, M.F. Metaverse ecosystem realization for its application development. In Proceedings of the 2024 16th International Conference on Computer and Automation Engineering (ICCAE), Melbourne, Australia, 14–16 March 2024; pp. 109–113. [Google Scholar]
  301. Yasuda, A. Metaverse ethics: Exploring the social implications of the metaverse. AI Ethics 2024, 1–12. [Google Scholar] [CrossRef]
  302. Kang, G.; Koo, J.; Kim, Y.G. Security and privacy requirements for the metaverse: A metaverse applications perspective. IEEE Commun. Mag. 2023, 62, 148–154. [Google Scholar] [CrossRef]
  303. Chen, Z.; Wu, J.; Gan, W.; Qi, Z. Metaverse security and privacy: An overview. In Proceedings of the 2022 IEEE International Conference on Big Data (Big Data), Osaka, Japan, 17–20 December 2022; pp. 2950–2959. [Google Scholar]
  304. Le-Khac, N.A.; Choo, K.K.R. A Practical Hands-On Approach to Database Forensics; Springer: Berlin/Heidelberg, Germany, 2022. [Google Scholar]
  305. Mostafa, A.M.; Ezz, M.; Elbashir, M.K.; Alruily, M.; Hamouda, E.; Alsarhani, M.; Said, W. Strengthening cloud security: An innovative multi-factor multi-layer authentication framework for cloud user authentication. Appl. Sci. 2023, 13, 10871. [Google Scholar] [CrossRef]
  306. Singh, M. Securing Data in the Metaverse: What We Need to Know. In Metaverse: Technologies, Opportunities and Threats; Springer: Berlin/Heidelberg, Germany, 2023; pp. 419–432. [Google Scholar]
  307. Canbay, Y.; Utku, A.; Canbay, P. Privacy concerns and measures in metaverse: A review. In Proceedings of the 2022 15th International Conference on Information Security and Cryptography (ISCTURKEY), Ankara, Turkey, 19–20 October 2022; pp. 80–85. [Google Scholar]
  308. Azam, F.; Semwal, A.; Biradar, A. A Secured Framework for Metaverse Applications. In Proceedings of the 2023 4th IEEE Global Conference for Advancement in Technology (GCAT), Bangalore, India, 6–8 October 2023; pp. 1–6. [Google Scholar]
  309. Huang, Y.; Li, Y.J.; Cai, Z. Security and privacy in metaverse: A comprehensive survey. Big Data Min. Anal. 2023, 6, 234–247. [Google Scholar] [CrossRef]
  310. Basyoni, L.; Tabassum, A.; Shaban, K.; Elmahjub, E.; Halabi, O.; Qadir, J. Navigating Privacy Challenges in the Metaverse: A Comprehensive Examination of Current Technologies and Platforms. IEEE Internet Things Mag. 2024, 7, 144–152. [Google Scholar] [CrossRef]
  311. Parlar, T. Data privacy and security in the metaverse. In Metaverse: Technologies, Opportunities and Threats; Springer: Berlin/Heidelberg, Germany, 2023; pp. 123–133. [Google Scholar]
Figure 1. An illustration of the 7-layer metaverse architecture.
Figure 2. Metaverse technologies.
Figure 3. Real-time metaverse hierarchical system.
Figure 4. Metaverse architecture.
Figure 5. Real-time metaverse in a closed-loop system.
Figure 6. Structures of computing in the network.
Figure 7. A general 5G cellular network architecture.
Figure 8. Immersive metaverse technologies.
Figure 9. Interoperability of the metaverse.
Figure 10. Metaverse applications: bandwidth versus latency.
Figure 11. Security challenges associated with the metaverse.
Table 1. Progression of VR, comparing bandwidth and latency across different technologies.

| Technology | Bandwidth Requirement | Latency | Resolution |
|---|---|---|---|
| Wi-Fi 6E | Up to 2.4 Gbps | ∼20 ms | HD, 2K, 4K; 8K with strong signal strength and minimal interference |
| Wi-Fi 7 | Up to 46 Gbps | ∼10 ms | Up to 8K; potential to support 16K video resolution |
| Wi-Fi 8 | Up to 100 Gbps | <1 ms | Up to 8K; potential to support 16K video resolution |
| 5G Network | 1–10 Gbps | 1–10 ms | Up to 4K; 8K depends on network conditions and coverage |
| NGCodec + 5G | ∼1–10 Gbps (optimized) | <5 ms | Up to 8K |
| FPGA-based VR Streaming | Depends on setup | <1 ms | Up to 8K |
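Table 1 can be read as a feasibility check: a given metaverse workload is deliverable over a link only if the link's peak bandwidth meets the stream's bitrate and its typical latency fits the interaction budget. The sketch below (our illustration, not from the surveyed works) encodes the table's nominal values and filters the technologies that satisfy a requested budget; real deployments vary with signal quality and congestion, and the FPGA-based entry's "depends on setup" bandwidth is represented here as unbounded for simplicity.

```python
# (technology, peak bandwidth in Gbps, typical latency in ms), per Table 1.
# FPGA-based streaming bandwidth "depends on setup"; modeled as unbounded.
TECHNOLOGIES = [
    ("Wi-Fi 6E", 2.4, 20.0),
    ("Wi-Fi 7", 46.0, 10.0),
    ("Wi-Fi 8", 100.0, 1.0),
    ("5G Network", 10.0, 10.0),
    ("NGCodec + 5G", 10.0, 5.0),
    ("FPGA-based VR Streaming", float("inf"), 1.0),
]

def suitable_links(min_bandwidth_gbps: float, max_latency_ms: float) -> list[str]:
    """Return the technologies whose peak bandwidth and typical latency
    both meet the requested budget."""
    return [
        name
        for name, bandwidth, latency in TECHNOLOGIES
        if bandwidth >= min_bandwidth_gbps and latency <= max_latency_ms
    ]

# Example: a hypothetical 8K interactive VR stream needing ~20 Gbps
# with a 10 ms motion-to-photon contribution from the network.
print(suitable_links(20.0, 10.0))
```

Under these nominal numbers, only Wi-Fi 7, Wi-Fi 8, and FPGA-based streaming clear a 20 Gbps / 10 ms budget, which mirrors the table's message that 8K-class real-time experiences sit beyond today's cellular links.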
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Share and Cite

MDPI and ACS Style

Hatami, M.; Qu, Q.; Chen, Y.; Kholidy, H.; Blasch, E.; Ardiles-Cruz, E. A Survey of the Real-Time Metaverse: Challenges and Opportunities. Future Internet 2024, 16, 379. https://doi.org/10.3390/fi16100379


