Article

Smart Factory Using Virtual Reality and Online Multi-User: Towards a Metaverse for Experimental Frameworks

by Luis Omar Alpala 1,2,*, Darío J. Quiroga-Parra 3, Juan Carlos Torres 1 and Diego H. Peluffo-Ordóñez 4,5
1 Virtual Reality Laboratory, ETSIIT, Department of Computer Languages and Systems, University of Granada, c/Periodista Manuel Saucedo Aranda, s/n, 18071 Granada, Spain
2 Logistics and Transportation Engineering Career, Universidad Politécnica Estatal del Carchi, Calle Antisana y Av. Universitaria, Tulcan 040102, Ecuador
3 Administrative, Economic, and Accounting Sciences, Universidad Cooperativa de Colombia, Cra. 73 #2a-80, Cali 760003, Colombia
4 Faculty of Engineering, Corporación Universitaria Autónoma de Nariño, Pasto 520001, Colombia
5 Modeling, Simulation and Data Analysis (MSDA) Research Program, Mohammed VI Polytechnic University, Ben Guerir 47963, Morocco
* Author to whom correspondence should be addressed.
Appl. Sci. 2022, 12(12), 6258; https://doi.org/10.3390/app12126258
Submission received: 7 May 2022 / Revised: 10 June 2022 / Accepted: 15 June 2022 / Published: 20 June 2022
(This article belongs to the Collection Virtual and Augmented Reality Systems)

Abstract

Virtual reality (VR) has been brought closer to the general public over the past decade as it has become increasingly available for desktop and mobile platforms. As a result, consumer-grade VR may redefine how people learn by creating an engaging “hands-on” training experience. Today, VR applications leverage rich interactivity in a virtual environment without real-world consequences to optimize training programs in companies and educational institutions. Therefore, the main objective of this article was to improve collaboration and communication practices in 3D virtual worlds with VR and the metaverse, focused on the educational and productive sectors of the smart factory. A key premise of our work is that the characteristics of the real environment can be replicated in a virtual world through digital twins, wherein new, configurable, innovative, and valuable ways of working and learning collaboratively can be created using avatar models. To do so, we present a proposal for the development of an experimental framework that constitutes a crucial first step in the process of formalizing collaboration in virtual environments through VR-powered metaverses. The VR system includes functional components, object-oriented configurations, an advanced core, interfaces, and an online multi-user system. We present the first application case study of the framework with VR in a metaverse, focused on the smart factory, which showcases the most relevant technologies of Industry 4.0. Functionality tests were carried out and evaluated with users through usability metrics, showing satisfactory results regarding its potential educational and commercial use. Finally, the experimental results show that a commercial software framework for VR games can accelerate the development of experiments in the metaverse to connect users from different parts of the world in real time.

1. Introduction

Having colleagues in the same place performing collaborative tasks benefits teamwork substantially. Currently, people and companies turn to the Internet as a digital medium to work, study, play, and learn with new customized resources [1]. Likewise, many communication tools and platforms have been strategically developed to keep people connected in real time from anywhere in the world. One of these tools is the implementation of three-dimensional (3D) virtual worlds. For years, virtual worlds have evolved through computer simulation, achieving hyper-realistic graphics that blend with reality. An overview worth reading on the broad concepts of virtual worlds and the metaverse is presented in [2].
A computer-generated virtual world is a space where users can interact with objects in an immersive way, freely explore all spaces, and socialize through audio and speech with groups of people connected in the same session, as if they were in the real world [3]. Virtual reality (VR) can also display complex content or useful data innovatively, regardless of the limitations of our physical reality [4]. Such an environment has the potential to elevate collaboration, communication, and learning to a higher level of quality compared to traditional technologies such as 2D displays.
Industry 4.0 has brought fundamental shifts towards collaborative communication in companies. Internal processes and cooperative tasks are performed through ongoing automation, using information-based technologies and intelligent systems (e.g., robotics, cloud computing, and collaborative platforms) that help staff communicate in real time from anywhere using only an Internet connection [5]. Part of this industrial change is the consolidation of technologies such as VR, which blurs the line between the physical and the digital world. VR applications have evolved over the last decade so that, by 2021, this technology had gone beyond video games towards real-time collaborative communication, and allowing co-workers to meet in virtual 3D environments may revolutionize training programs and teamwork [6].
We believe that the virtual worlds available today already offer online collaboration capabilities in education, work, and leisure, represented by 2D and 3D screen platforms. In virtual worlds with VR, the developments are preliminary: for example, Facebook (Meta) announced in October 2021 that it intended to create virtual worlds under the name of a “metaverse” using 3D VR, wherein users can interact and feel present regardless of the distance between them [7]. In general, companies try to provide 3D environments in which users experience a sense of having a real presence in the virtual world, as current 2D and 3D virtual environments offer collaborative work only through basic simulations, video, audio, chat, and data. These provide only a low-immersion digital experience compared to 3D virtual environments with VR, which enable the user to become totally immersed in a three-dimensional space, with high-quality graphics, connected to the Internet, where the user may carry out experiences similar to those in the real world [8].
Therefore, the main objective of this article was to improve collaboration practices in 3D virtual worlds with VR, focused on the education and production sectors in the smart factory, starting from the fact that the characteristics of the real environment can be replicated in a virtual world through a digital twin, and that new, configurable, innovative, and valuable ways of working and learning collaboratively using avatar models can be created [9].
The framework was developed using the Unreal Engine 4 (UE4) game engine. We followed a methodology that incorporates video game development which is adapted to VR and metaverse development.
Finally, we present a study of the first case of the application of the framework with VR in a metaverse, focused on the smart factory, that shows the most relevant technologies of Industry 4.0 (including big data and data analytics, radio frequency identification (RFID), human–machine interface (HMI), manufacturing execution system (MES), computer-aided maintenance management (CMMS), augmented reality, virtual reality and simulation, collaborative robotics, artificial intelligence, cloud computing, collaborative platforms, digital twin and cyber-physical systems), whose objective is to carry out collaborative learning for training and education in this sector.
The rest of this article is structured as follows: In Section 2, we define the state of the art for Industry 4.0, smart factory, virtual reality and metaverses. In Section 3, we introduce our VR system framework. Section 4 describes the methodology used. In Section 5, we look at the case study. In Section 6, the results and usability tests of the application are shown. We conclude by reviewing the contributions, implications, and conclusions of the research, and by highlighting future research avenues.

2. Background and Related Works

This section presents a brief overview and a review of related works on the main topics addressed in the research, ranging from Industry 4.0 and Virtual Reality to smart factory and metaverse.

2.1. Fourth Industrial Revolution

The information and communication technologies (ICTs) that emerged in the 1960s and 1970s have become, over time and space, the material, digital, and technological basis and basic infrastructure of Industry 4.0 (I4.0). This digital technology is based on computers and the Internet [10]. This phase is considered the third industrial-technological revolution. The material and conceptual basis for the first revolution was the steam engine. In the second revolution, it was the combustion engine and electricity. Industry 4.0, which emerged in Germany between 2011 and 2015, is based on the three previous revolutions, i.e., machines, automation, systematization, and data processing. Specifically, I4.0 integrates the elements of the physical context with those of cyberspace [11]. However, it is still in the process of development and is still manifesting itself as a paradigm for the productive processes of the economy, companies, and countries [12].
The literature on this subject considers the most relevant I4.0 technologies to be the Internet of Things (IoT), cyber-physical systems (CPS), big data (BD), radio frequency identification (RFID), human–machine interface (HMI), manufacturing execution system (MES), augmented reality (AR), virtual reality (VR) and simulation, collaborative robotics, artificial intelligence (AI), cloud computing (CC), collaborative platforms (CP), digital twin (DT), 3D virtual environments, 3D additive manufacturing (AM), cybersecurity, wireless sensor networks (WSNs), and drones [13,14,15,16].
The application of I4.0 through virtual technologies such as simulation and VR has made it possible to explore in this field complex automation and information applications that can be integrated into companies. In [17], an operator training system was proposed based on I4.0 and VR, whilst in [18], an I4.0 laboratory was designed based on virtual systems and computer simulation, giving an approximation of the operation of the smart factory through virtual technologies and a prototype. Moreover, in [19], the authors present the technological aspects of VR to help VR developers create industrial environments based on I4.0 that simulate real machine and process behaviors by means of computer simulation.

2.2. Smart Factory

The concept of the smart factory (SF) emerged in the late 1980s with artificial intelligence (AI) as a manufacturing tool [20]. As an advanced intelligent application, the SF facilitates the fast and dynamic manufacturing of products in a stable manner that easily responds to demand for customized products, while optimizing time, supply, and production networks [21].
It is known that digital transformation will be essential for companies’ future competitiveness. After the global pandemic of the last two years, supply chains have changed due to manpower shortages, which has meant that companies have continued to compete by adopting smart factory technologies and solutions such as artificial intelligence, autonomous mobile robots, cybersecurity, and control centers; in addition, the large amount of data generated by companies and its intersection with humans will lead to a business transformation.
Another key technology within the smart factory is that of autonomous mobile robots (AMRs), which, by automating repetitive activities, improve the production efficiency of the factory; it is worth noting that, in the coming years, these technologies will not replace the significant labor that humans perform within the factory [22].
Among the applications that some authors have made of virtual worlds within the SF, the work in [23], which applies VR to production systems for training, is notable, although that research focuses especially on single-user training. In [24], a laboratory specialized in the SF was built using VR in order to teach human–machine concepts. Additionally, in [25], the authors studied the evaluation of smart products within the SF, using VR to visualize features and functionality.

2.3. Virtual Reality and Virtual Worlds

Virtual reality (VR) is a non-real digital environment artificially designed and generated in a virtual interactive context. It is designed to simulate real life, and its two essential characteristics are immersion and presence [26]. Immersion is understood as the level of objective sensory fidelity provided by a VR system. Presence is what each user experiences as a product of being immersed in the digital environment [27,28]. Currently, VR technologies have gained attention, and several applications have been developed due to the availability of immersive platforms and new technologies related to this field [29].
In the literature, VR is being applied to different fields such as medicine, engineering, architecture, design, etc., and developers are increasingly achieving new emerging features to integrate within the VR system, although for the most part, only as first approximations [30]. Within Industry 4.0 and the smart factory, it is still premature for mass and widespread application; however, authors such as those of [31,32] have shown how this field can have a promising development in the coming years, especially for manufacturing and education.

2.4. Metaverses

The term metaverse, or metauniverse, originated from the novel Snow Crash by Neal Stephenson in 1992, combining the prefix meta-, meaning beyond, and the universe [33]. The term is conceived as multiple three-dimensional virtual spaces linked to an observed universe and can also refer to multidimensional experiences on the Internet, spanning Web 2.0, 3D technologies, virtual reality, and augmented reality [34]. Metaverses are considered environments wherein humans can interact as avatars in a metaphor of the real world, without limitations, from economic and social perspectives. This Internet iteration is performed through software in cyberspace [35].
In the mid-1970s, the first games characterized by text adventures in virtual worlds, such as MUDs, appeared. With the Internet, virtual digital worlds were created to live other lives. In 1996, the first versions appeared, such as the little Tamagotchi animals, which had a digital virtual life and required care to survive [36]. The pioneer that elaborated this concept was the Acceleration Studies Foundation (ASF), and the concept gained further prominence when NVIDIA announced its Omniverse real-time simulation platform for 3D production in 2020 as a next-generation alternative. Likewise, gaming platforms such as Roblox and Fortnite hosted the Dynamite choreography video, while Blackpink placed the Ice Cream 3D avatar video on Zepeto [37].
Recently, the conglomerate Meta, formerly known as Facebook, announced in 2021 the creation of a large-scale metaverse with VR under one of its “Horizon Worlds” platforms, to connect its users in meetings, training, and virtual tours through Internet connection [7].
The application of the metaverse is already being carried out, as highlighted by works in the literature, such as in [38], wherein the metaverse was applied together with VR in teaching; in [39], wherein the authors addressed the collaboration of personnel in 3D virtual environments; and in [4], wherein a mixed-reality approach in communication spaces was explored. As for framework proposals, frameworks for experiences in VR and the metaverse were introduced in [40,41,42,43].

2.5. Summary

As outlined in the literature review above, the metaverse is increasingly consolidating its position as a large-scale industrial workspace. An appealing, expected outcome is that companies will be able to produce digital copies of all their processes (i.e., a “digital twin”), which can be tested in the metaverse before being physically implemented. Thus, the design can be corrected and improved before construction begins, significantly saving both time and resources.
One of the key aspects that makes the development of the metaverse feasible is the immersive quality of VR, which allows users to interact with different objects and scenarios as accurately as they would in the real world. Therefore, the implementation of the metaverse in activities of daily life will be closely linked to the development and sophistication of VR devices.
Finally, the scientific and technical literature reveals the diverse and numerous ways in which authors are addressing the field of VR-powered metaverses from different sectors. It is expected that, in the coming years, this technology will be progressively extended to all sectors and will be well and widely received.

3. The Unreal Engine Experiment Framework

This section describes the development of a framework for a VR system as a proposal for creating online multi-user 3D virtual environment projects. The VR system has a predetermined folder structure that can be the basis of any project. Each new element of the experience can be unambiguously integrated into the structure to keep even large projects tidy and easy to navigate. The Unreal Engine 4 (UE4) game engine, version 4.26, was used to develop the framework. UE4 is a video game development software that includes modules for rendering 2D or 3D images, sound systems, scripting, and artificial intelligence in real time.
Currently, the most popular graphics development engines on the market are Unity3D and UE4 [44]. UE4 has powerful development potential in terms of official tools and plugins.
The graphical quality of its virtual environments differentiates this software from other graphics engines through highly realistic, advanced real-time rendering. It allows works to be designed and developed at AAA game quality, and its platform provides built-in support for VR [45]. Figure 1 shows the important sections that have been integrated into the framework with the UE4 graphics engine.
We then describe the VR system according to Figure 1, consisting of a series of components that allow virtual reality to work properly for immersive experiences.

3.1. Basics

  • Game classes—The VR system implements the basic functions needed to make it work, including level transitions, pawn navigation, settings, and other essential data. The most important game classes are as follows: the game mode declares all game classes except the game instance. The game instance has the unique ability to maintain its state even after a level change, which makes it useful for storing persistent data; it is therefore the best place to store language, graphics, or audio settings, and it also stores the data assets of the current level (a minimal game instance sketch follows this list). The player controller generates, owns, and navigates the pawn and handles the level transition logic and the pause. Player stats manage and provide information about each player and are therefore especially useful for multi-user mode.
  • Level setup—Each level of the framework is encapsulated in a general map that consolidates all the maps and the level loading. In addition, the framework relies on several essential actors at all levels to function correctly. These must be placed and configured in each new level.
  • Navigation—The framework’s navigation elements are connected to the teleportation system, which is very important for the VR environment since normal movement in VR is prone to cause motion sickness. Teleportation navigation elements primarily serve to restrict the areas to which a player can teleport. In VR, navigation uses a flat teleport actor, and the component’s possible interaction methods can be selected.
  • Changing levels—The framework provides a series of mechanisms to load or change levels without problems. For this function, we find a fully configured intro level which contains: intro screen, player position, map info actor, and sky sphere. The transition object is automatically created and filled with the content specified in the level data asset, and it is opened every time a new level is loaded.
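As a concrete illustration of the persistent role of the game instance described above, the following is a minimal UE4 C++ sketch, assuming hypothetical class and field names rather than the framework's actual classes: the game instance outlives level transitions, so settings and the current level's data asset stored on it survive every level change.

```cpp
// Minimal sketch (hypothetical names) of a game instance holding persistent data.
#pragma once

#include "CoreMinimal.h"
#include "Engine/GameInstance.h"
#include "Engine/DataAsset.h"
#include "VRSystemGameInstance.generated.h"

UCLASS()
class UVRSystemGameInstance : public UGameInstance
{
    GENERATED_BODY()

public:
    // Persistent user settings: the game instance is created once per run,
    // so these values are not reset when a new level is opened.
    UPROPERTY(BlueprintReadWrite, Category = "Settings")
    FString CurrentLanguage = TEXT("en");

    UPROPERTY(BlueprintReadWrite, Category = "Settings")
    float MasterVolume = 1.0f;

    // Data asset describing the currently loaded level (illustrative usage).
    UPROPERTY(BlueprintReadWrite, Category = "Level")
    UDataAsset* CurrentLevelData = nullptr;
};
```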

3.2. Controllers

Each controller consists of a base motion controller and motion components. The base motion controller acts primarily as a container for the motion components.
Additionally, the motion components can be connected to each motion controller independently of each other, as components of the actor classes, each supporting the highlight, select, and grab functionalities. However, depending on the type of controller, these functionalities are implemented differently.
Table 1 shows the controller styles with functionalities according to the template shown in Figure 1.
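A hedged sketch of such a controller setup in UE4 C++ is shown below: a pawn creates one motion controller component per hand, and the HMD runtime drives their transforms. The class and variable names are assumptions for illustration, and the HeadMountedDisplay module is presumed to be among the project's build dependencies.

```cpp
// Sketch (hypothetical names) of a pawn with one motion controller per hand.
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Pawn.h"
#include "Components/SceneComponent.h"
#include "MotionControllerComponent.h"
#include "VRControllerPawn.generated.h"

UCLASS()
class AVRControllerPawn : public APawn
{
    GENERATED_BODY()

public:
    AVRControllerPawn()
    {
        // Root of the tracked space; both controllers attach relative to it.
        VRRoot = CreateDefaultSubobject<USceneComponent>(TEXT("VRRoot"));
        SetRootComponent(VRRoot);

        // One tracked component per hand, updated by the HMD runtime each frame.
        LeftController = CreateDefaultSubobject<UMotionControllerComponent>(TEXT("LeftController"));
        LeftController->SetupAttachment(VRRoot);
        LeftController->MotionSource = TEXT("Left");

        RightController = CreateDefaultSubobject<UMotionControllerComponent>(TEXT("RightController"));
        RightController->SetupAttachment(VRRoot);
        RightController->MotionSource = TEXT("Right");
    }

protected:
    UPROPERTY(VisibleAnywhere) USceneComponent* VRRoot;
    UPROPERTY(VisibleAnywhere) UMotionControllerComponent* LeftController;
    UPROPERTY(VisibleAnywhere) UMotionControllerComponent* RightController;
};
```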

3.3. The Component System

Components can be attached to actor classes and to instances of actor classes to provide the desired functionalities. Without building a complicated class hierarchy, components can be freely interchanged and allow actor customization. All components created for the application fall into one of the following categories according to their main functionality (an interaction component sketch follows this list):
  • Interaction components—Manage the interaction between the player and the actors in an application, including selection, capture, and others.
  • State components—Apply, initiate, and reproduce state changes for the owning actor.
  • Snapping components—Cover all the requirements necessary to allow actors to snap into place after being released by a player.
  • Multi-user components—Consolidate a set of useful multi-user management components.
  • UI components—Cover the design, visualization, and content of the user interface elements.
  • Pawn components—Provide the pawn with functions such as controls.
  • Miscellaneous components—Comprise a set of useful components that run in the background to enable advanced functionality.
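To make the component approach concrete, the following is an illustrative UE4 C++ sketch of an interaction component in the first category above. All names are hypothetical, and the attach/detach calls merely stand in for whatever grab logic the framework actually implements.

```cpp
// Illustrative interaction component: attach it to any actor to make it
// highlightable and grabbable without subclassing (hypothetical names).
#pragma once

#include "CoreMinimal.h"
#include "Components/ActorComponent.h"
#include "Components/SceneComponent.h"
#include "GameFramework/Actor.h"
#include "InteractableComponent.generated.h"

UCLASS(ClassGroup = (Interaction), meta = (BlueprintSpawnableComponent))
class UInteractableComponent : public UActorComponent
{
    GENERATED_BODY()

public:
    // Called by a controller when its trace hits the owning actor.
    UFUNCTION(BlueprintCallable, Category = "Interaction")
    void Highlight(bool bEnable) { bHighlighted = bEnable; }

    // Grab: attach the owner to the grabbing component (e.g., a motion controller).
    UFUNCTION(BlueprintCallable, Category = "Interaction")
    void Grab(USceneComponent* AttachTo)
    {
        if (AttachTo && GetOwner())
        {
            GetOwner()->AttachToComponent(AttachTo, FAttachmentTransformRules::KeepWorldTransform);
        }
    }

    // Release: detach the owner so that snapping components can take over.
    UFUNCTION(BlueprintCallable, Category = "Interaction")
    void Release()
    {
        if (GetOwner())
        {
            GetOwner()->DetachFromActor(FDetachmentTransformRules::KeepWorldTransform);
        }
    }

private:
    bool bHighlighted = false;
};
```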

3.4. Environments

The framework is based on a rather intricate pawn class hierarchy that provides custom pawns for the different environments supported by the framework, including desktop, VR, and mobile. The most basic pawn is pawn base, which is the parent class for all pawns.
The VR pawn is the most unique of the pawn classes, as the virtual reality environment requires very different characteristics from the other environments. When starting the application, the player controller checks the experience environment and chooses the pawn specified in the level information file for this environment (see the selection sketch after this list). The following criteria are used:
  • VR—This pawn is automatically chosen when an HMD is present at the beginning of the experience and generates motion controllers to move and interact with the virtual reality experience.
  • Desktop—The desktop pawn serves as the default environment and provides functionalities to move and interact with the experience using the mouse and keyboard.
  • Mobile—The mobile pawn is automatically chosen when the operating system is iOS or Android and provides functionalities to move and interact with the experience through the touch screen.
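A hedged sketch of this selection logic in UE4 C++ is shown below; the function and parameter names are assumptions made for this example, while the HMD check and the platform macros are standard engine facilities.

```cpp
// Choose a pawn class for the current environment (illustrative helper).
#include "CoreMinimal.h"
#include "GameFramework/Pawn.h"
#include "Templates/SubclassOf.h"
#include "Kismet/HeadMountedDisplayFunctionLibrary.h"

TSubclassOf<APawn> ChooseEnvironmentPawn(TSubclassOf<APawn> VRPawn,
                                         TSubclassOf<APawn> MobilePawn,
                                         TSubclassOf<APawn> DesktopPawn)
{
    // VR: an HMD is present and enabled at the start of the experience.
    if (UHeadMountedDisplayFunctionLibrary::IsHeadMountedDisplayEnabled())
    {
        return VRPawn;
    }
#if PLATFORM_ANDROID || PLATFORM_IOS
    // Mobile: touch-screen environment on iOS or Android builds.
    return MobilePawn;
#else
    // Desktop: mouse-and-keyboard default.
    return DesktopPawn;
#endif
}
```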

3.5. Multi-User

Creating a multi-user application is an intricate task and includes many issues outside the framework’s scope. Therefore, setting up an application in online mode includes the replication and ownership of objects and appropriate interfaces to configure the multi-user mode and to create and join sessions. The multi-user functionality includes:
  • The host and server—Two external plugins were used for the multi-user functionality, which are integrated into the core programming: the EOS Core plugin, used to integrate Epic Online Services (EOS) into the project, and the Vivox Core plugin, which connects the project to the Vivox voice chat system so that it can be used by several users connected in real time.
  • Replication—All actors whose status, position, or other properties are relevant in multi-user mode must be replicated to ensure synchronization between clients (a minimal replication sketch follows the prerequisites list below).
  • Preparations and configuration—The external plugins must be configured and registered in the Epic Games developer portal through the following steps for the application to work online: creating an Epic Games developer account, creating a product, setting up Epic account services, P2P setup, and checking the product details. The project must be configured to correctly create a multi-user application using the framework, the plugins, and the Epic Games online services add-on.
  • Testing—If the settings are correct, we should be ready to test the multi-user mode with different users anywhere in the world. When starting in multi-user mode, the following options are available:
    • Organize a session—The session is automatically created and added to the session list.
    • Refresh session list—Refresh the session list if a session is hosted and not displayed.
    • Join a session—The player pawn is automatically generated in one of the starting positions for a created session.
The prerequisites are as follows:
  • All clients must be using the same engine build.
  • All computers and HMDs must be on the same LAN for local connections to be possible; over the Internet, it is possible to connect from anywhere in the world with no restriction other than Internet access.
  • All clients must start with exactly the same content (typically synced from source control).
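The replication requirement mentioned above can be illustrated with the following minimal UE4 C++ sketch; the actor and property names are hypothetical examples rather than classes taken from the framework.

```cpp
// Minimal replicated actor: its state and movement stay in sync on all clients.
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "Net/UnrealNetwork.h"
#include "ConveyorSwitchActor.generated.h"

UCLASS()
class AConveyorSwitchActor : public AActor
{
    GENERATED_BODY()

public:
    AConveyorSwitchActor()
    {
        bReplicates = true;          // the server sends this actor to all clients
        SetReplicateMovement(true);  // keep its transform synchronized as well
    }

    // Register which properties are sent over the network.
    virtual void GetLifetimeReplicatedProps(TArray<FLifetimeProperty>& OutLifetimeProps) const override
    {
        Super::GetLifetimeReplicatedProps(OutLifetimeProps);
        DOREPLIFETIME(AConveyorSwitchActor, bIsRunning);
    }

protected:
    // Machine state that every connected user should see change in real time.
    UPROPERTY(Replicated)
    bool bIsRunning = false;
};
```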

3.6. User Interfaces and Data

The framework provides several user interfaces to choose how information is displayed, and many possibilities for the user to interact with the experience, including widget-based user interfaces such as the HUD, information windows, or palettes, described below:
  • Widgets—Widgets represent each visual part of the application with images, text, buttons, interactions, and customization adapted to the virtual scenarios.
  • Data—Data assets are a pre-established set of variables used to store, organize, and access various information. As a result, they occupy a role similar to data tables, with the advantage of being more intuitive to manage. The framework provides the following types (a row-type sketch follows this list):
    • Data tables—The data tables provided for this purpose are called I18n data tables and are separated into rows. Each row is assigned a key that, along with the current language setting, unambiguously points to a cell in the data table providing the text that the widget ultimately displays.
    • Structs—The framework uses this concept to standardize translatable text, communication between components, and other functions. Many structures are only used internally and are automatically generated; however, others are used especially in component configuration.
    • Enums—Enumerations provide a modifiable list of keys that can change functions or be displayed in various parts of the application. Most enumerations are best explained in their inherent environment, such as the component, user interface, or other element in which they are used.
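As an illustration of how one such row could be declared in UE4 C++, the following sketch defines a data table row type with one text column per language; the struct and column names are assumptions made for this example, not the framework's published types.

```cpp
// Row type for an I18n data table: the row key plus the active language
// setting resolve to the text a widget finally displays (hypothetical names).
#pragma once

#include "CoreMinimal.h"
#include "Engine/DataTable.h"
#include "I18nTextRow.generated.h"

USTRUCT(BlueprintType)
struct FI18nTextRow : public FTableRowBase
{
    GENERATED_BODY()

    // One column per supported language; the language stored in the game
    // instance selects which column a widget reads at runtime.
    UPROPERTY(EditAnywhere, BlueprintReadOnly, Category = "I18n")
    FText English;

    UPROPERTY(EditAnywhere, BlueprintReadOnly, Category = "I18n")
    FText Spanish;
};
```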

4. Method

This section presents the methodology used to develop the VR system proposed in the framework. The method is composed of a methodology adapted from video game development and usability testing.

4.1. VR Development Methodology

We propose a methodology based on [46] to design and develop VR systems. The methodology is adapted to work specifically with VR and metaverses, since previously proposed methodologies focus only on the development of video games for 2D screens. Figure 2 depicts the stages of our methodology.
The phases of the methodology are described below:
  • Phase 1—Concept. In this phase, we propose the project’s initial idea. All planning supported by the Game Design Document (GDD) is performed to record the technical highlights of the project.
  • Phase 2—Pre-production. We identify the requirements and resources needed to start the project and plan for specifications and monitoring. Here, we also begin the development of the concept, the preliminary sketches, and the gameplay design. The GDD is constructed and drafted to contain the most relevant aspects.
  • Phase 3—Production. In this phase, the development of the framework for the creation of the VR system is carried out. The time and resources needed for the development are considered depending on the size of the project and the specific needs of the sector.
  • Phase 4—Post-production. The testing and operation of the application are performed in different versions: alpha, beta, and finished project. In the case of metaverse testing, multi-user mode is tested over an Internet connection from different locations.

4.2. Usability Testing Methodology

The usability tests are used to validate the operation of the application with users. In this stage, different tests are presented (Alpha Version, Beta, and Final Project) according to phase 4 (post-production). Therefore, we need to use a methodology for usability testing with users once the project’s versions have concluded.
For this purpose, the methodology used—adapted from [47]—consists of three stages, as seen in Figure 3. First, during planning, we define the objective of the test, the user profile, the location, the equipment, and the metrics. Then, in test development, we prepare the resources for the test and perform the test in real time. The test results, consisting of the user experience, are then analyzed according to the playability metrics. Finally, we draw some conclusions and propose some strategies for improvement.

5. Case Study

5.1. Digital Factory Metaverse

For the project, namely the application of smart factory production plants using virtual reality and online Multi-User, an entirely new approach to the planning and design of production plants was followed using innovative technologies such as VR and the application of disruptive Industry 4.0 technologies. During the project, the first customized design guidelines and standards were applied based on hyper-realistic virtual worlds with high-quality graphics and realistic practice scenarios that enhanced the flexibility of planning, design, and test runs for the training and education of personnel in real-time using the 3D-metaverse virtual environment system.
A manufacturing plant was taken as a case study under the name of “Digital factory metaverse”. Figure 4 gathers some views. The plant design involves the production, raw material, finished product, and complementary areas such as the administration, services, lockers and bathrooms, green areas, among others. Furthermore, the project was designed to be executed in online multi-user mode where groups of users can make visits and practice in the production environment offered by the plant according to reality. This project fully integrates the VR framework proposed in this research.

5.2. Experimental Design

5.2.1. Participants

The study group consisted of 9 women and 11 men (mean age: 23 years; standard deviation: 1.2 years) with no physical or visual health problems. In total, 20 participants performed the test according to usability metrics. They were recruited through a university. All participants received an incentive to encourage their participation.
To perform the usability tests, the participants were divided into three groups according to their level of experience in the use and management of VR devices and virtual worlds.
The groups had the following participants:
  • Group 1—eight participants, none of whom had used VR devices before.
  • Group 2—six participants with previous experience of video games and virtual worlds, although not with VR.
  • Group 3—six participants who had some previous experience with virtual world management and the use of VR equipment.

5.2.2. Apparatus

The VR environment was set up using the Meta Quest 2, Oculus Rift S, and Oculus Rift VR headsets and computers with the following specifications: AMD Ryzen 7 5800H octa-core (5000 series) CPU, Nvidia GeForce RTX 3060 (6 GB GDDR6) GPU, and 16 GB of RAM (2 × 8 GB DDR4-3200 SO-DIMM).
The Oculus Quest 2 has an average FOV of 90°, and it mounts an IPS LCD screen with a resolution of 1832 × 1920 pixels per eye and a maximum refresh rate of 90 Hz. It also has two touch controllers with six degrees of freedom (6DOF) of movement, thanks to which they can be tracked by the front cameras incorporated in the headset.

5.2.3. Procedure

According to the methodology described in Figure 3, the outline of the data obtained from the usability testing methodology, from planning to test results, is summarized in Table 2.
In Section 6.3, we present the test results and conclusions of the VR Usability Testing applied, including: metrics definition, functionality and playability, improvement strategy, metrics evaluation and conclusions, among other parameters.

6. Results

This section presents the application of the VR system in a production plant with a smart factory approach, for which a validation of the framework and methodology of development of VR applications was made. In addition, the application was validated as a metaverse creation in online multi-user mode with a group of users in team collaboration, the use of avatars, and usability through metrics. The most important items in this section are described in detail below.

6.1. Effective Team Collaboration in the 3D Virtual Environment

We highlight team collaboration through the use of the 3D virtual environment with an Internet connection to support decision-making in the activities that have been developed for immersive practice and training. Within the concept of the metaverse, the developed framework considers that team collaboration must be carried out effectively both individually and in groups [48].

6.1.1. 3D Visual Environment/Metaverse

The developed 3D virtual environment involving VR supports team collaboration and the possibility of integrating different applications into the interaction, such as showing videos and audiovisual presentations; it also offers the ability to manipulate the 3D design to make custom configurations [49].
According to [48], three capabilities of the 3D virtual environment could affect team collaboration: presence, realism, and interactivity; in our research, we applied them as follows:
  • Presence—The developed environment presented in the case study can stimulate users’ immersion because it offers an immersive experience for each of the senses through the perception generated by the brain. In addition, it can stimulate participation in each participant with visual cues, sounds, texts, and relevant simulations that could help them process, in real time, the information given in the virtual scenario.
  • Realism—We focused on a virtual environment that offers high-quality realism in textures, allowing users to believe that the virtual environment is real; for this reason, during development, images, assets, and components were integrated into the UE4 graphics engine so that navigation conveys the hyper-realistic aspects of the objects as in real life.
  • Interactivity—The interactivity present in the application with VR gives users the ability to move and navigate with the controls through the entire virtual environment created in real time.

6.1.2. Multi-User Mode of the Metaverse Application

The process carried out to use the content of the metaverse created on a small scale in online multi-user mode (see Figure 5a) allowed the application to run correctly through an Epic Games server/cloud for UE4 once online sessions with users had started. Figure 5b shows the avatar-based collaboration model that was implemented in the developed virtual environment. Each of the avatars can be adapted to the situation of the virtual environment; for this, we can change the controls (laser, hands, radial menu).

6.2. Small-Scale Metaverse Application: Smart Factory

The application was developed under the framework of VR systems and the methodology proposed in this research. The case study was developed under a smart factory approach to study the integration of the leading technologies of Industry 4.0 in a virtual factory.
For this purpose, we designed the main elements that integrate a factory, such as facilities, machinery and equipment, main production areas, warehouses, green areas, parking, and other components that make up the whole of the factory’s operation. We need to carry out a technical and engineering study to integrate the smart factory using Industry 4.0 technologies [37]. Thus, we can scale it in a virtual factory with VR. Table 3 describes 12 Industry 4.0 technologies integrated into the virtual factory.
The technologies described in Table 3 were jointly integrated into the application of the virtual factory case study, and some of these technologies, including digital twin, artificial intelligence, cloud computing virtual reality, and simulation, were immersed in processes, whilst other technologies, such as big data, data analytics, collaborative platforms, and manufacturing execution system, were integrated using visual representation, content, and management adapted to the operation of these technologies according to reality.
The other technologies were integrated into the processes of the production plant according to the needs of the production process and work according to the machinery and area of the plant. Figure 6 graphically shows the application of the virtual factory with VR and its operation according to the framework and technologies of Industry 4.0 described in Table 3.

6.3. Usability Testing

The usability tests based on user experience were conducted to discover which problems may or may not be present by observing how a group of users uses the VR application, in order to propose improvements and solutions [50]. We defined the metrics according to the situation to perform the usability tests and obtain the results of the use of the VR application with the participants.

6.3.1. Metrics Definition

The research proposed to evaluate four categories of metrics for the usability testing of the VR system: 1. user experience; 2. preparation and control of devices; 3. multi-user; and 4. immersion effects. The time that the participants spent using the application during the test was also taken into account. Table 4 details the categories and the items used in each one.
Metric 4. Immersion effects can be evaluated at four levels: none, slight, medium, and strong.

6.3.2. Functionality and Playability

The online multi-user VR system was successfully tested with groups of users simultaneously connected to test the functionality and playability of the application. Figure 7a shows the startup screen of the VR development in alpha version, and all participants used the application under the same conditions. Figure 7b shows several participants testing the application simultaneously in the laboratory.
In Table 4, the main items evaluated for user experience were: learning, help, application functionality, comparison of the VR application with other platforms, application graphics, navigation and interaction, and application content with respect to the smart factory. Figure 8a shows the results by group for user experience.

6.3.3. Proposed Improvement

In a subsequent analysis, with all participants meeting as a group, seven propositions were defined that, after the usability tests with the participants, could be concluded to hold at high levels:
  • P1—Compared to traditional virtual technology, workgroup users using VR in the virtual world may experience a higher level of (a) presence, (b) realism, and (c) interactivity.
  • P2—After using VR, users also experience a higher level of (a) social presence and (b) control over their self-expression with their avatars.
  • P3—Users of the virtual world have a higher level of information processing.
  • P4—Users of the virtual world have a higher level of communication support at the event.
  • P5—VR virtual environments developed with a graphics engine such as UE4 demonstrate a high level of realism compared to traditional virtual scenarios.
  • P6—The 3D virtual world can function perfectly as a small-scale metaverse. Users mostly experienced near-realistic training and training efficiency compared to a real environment.
  • P7—VR technology and virtual worlds can help institutions and companies save costs related to transportation logistics, purchase supplies and materials, and risk prevention in hazardous environments.
Although most of the tests carried out were satisfactory for the users, it is worth highlighting some proposals for improvement for a new version of the VR application:
  • P1—VR devices are critical to the experience: students who tested with the Oculus Rift Touch headsets experienced dizziness after half an hour of use, whereas users with Meta Quest 2 headsets experienced no dizziness, which is mainly due to the resolution of the lenses and the integration of motion-detection sensors.
  • P2—The avatars for the application version are preliminary, so several users said they wanted to customize their avatars.
  • P3—The application’s functionality may be limited in terms of graphics and playability for low-end computers.
  • P4—Although the tests were conducted in the laboratory with the vast majority of users, it is expected that users will be able to access the VR equipment from their homes in the future.
  • P5—The navigation and interaction with the content worked preliminarily on first use. New students who started with the VR world found it difficult to become accustomed to the controls and interactions within the application. However, users who had already tried VR or video games learned very fast and performed very satisfactorily during the tests. With more practice, users are expected to become accustomed to it without any inconvenience.

6.3.4. Evaluation of Metrics

We evaluated the metrics according to Table 4 using an online questionnaire (available at: https://forms.gle/o2FFLjrBwJ9EBtog7—accessed on 31 May 2022). The questionnaire was designed to evaluate conceptual, qualitative, descriptive, and exploratory aspects, rather than analytical or correlational statistical parameters.
A summary of the main results of the metrics is shown in Figure 8a, whilst the results obtained from usability are shown according to the level of satisfaction experienced by the user during practice for each of the metrics. The evaluation was carried out with questions under the four applicable parameters of the metrics for this study, namely (See Table 4): 1. user experience; 2. VR preparation; 3. multi-user; and 4. immersion effects. Figure 8b shows the average time per group that the participants spent using the VR devices.
The measurement scale used is in the range (0–1) in order to assess the behavior of each user by group. Since the participants used both the devices and the VR application, it can be concluded which improvements and relevant aspects can be integrated into a new version of the development of production plants with a focus on the smart factory and the metaverse.
The most relevant results obtained from the four metric parameters used in a descriptive and exploratory qualitative study are detailed in Table 5.

6.4. Discussion

VR and 3D environment technologies are still not widely known and applied in real contexts. Nonetheless, a number of works have shown their important application in the educational, training, and industrial areas.
With the aim of highlighting the versatility and importance of the VR-powered metaverse, this research developed an approach for team collaboration and information processing using 3D and VR virtual environment tools. Based on this first approach to creating small-scale metaverses, developers in this field can continue to design and develop new features to enhance team collaboration using VR. The ways of working with virtual environments that have been accepted to date can change to immersive 3D virtual environments, which are expected to allow people to have a greater degree of interaction and communication without the need to be present in the same place, allowing users from anywhere in the world to connect.
Metaverse developers can focus their research on new and unique multimedia capabilities in 3D virtual environments with VR, as well as ensuring knowledge sharing and collaboration for education, services, meetings, sales and purchasing and, in general, the entire productive sector in which more than one person can interact, instead of continuing to seek the simulation of interaction on virtual platforms [2]. The practical potential presented in this paper could be applied successfully with any type of user or by companies, with metaverses based on real experiences, without incurring costs or risks due to a lack of training.
The results of this work, as first evidence of application, show the potential of these VR and 3D technologies in multiple contexts. Therefore, it is important to advance new practical work in this field, which can continue to show empirical certainties regarding both the level of acceptability and the application in real environments, with the caution required by the dynamics of technological progress in its application.

7. Contribution to Research

Our VR system framework model provides a practical basis for empirical research in computer graphics and the VR computing of computer-generated 3D virtual models. To date, research papers focused on this subject of study [40,41,42,43] have provided frameworks in areas other than the one studied here, covering theoretical content, team collaboration with avatars, and the design of virtual worlds. These virtual worlds may not be specifically adapted for application in different areas or may not be relevant to particular areas in some cases.
Therefore, through our proposed framework model, we can discuss the capabilities that the virtual environment with VR contributes to a preliminary application of a small-scale metaverse, building on previous models of collaborative work mediated by the computer simulation of 3D virtual worlds. We specifically show the capabilities that a metaverse of an industrial plant with a smart factory can support in collaborative information processing, since the users who tested the application experienced real-world sensations within the virtual world, were able to grasp objects, interact with their team and, above all, experience teaching–learning on the most relevant points of the project through a multi-user connection from different sites in real time. Future research on our framework in this topic can empirically test new applications for the productive sector and education, which will make it possible to deepen and put into practice user-focused models of virtual worlds with more realistic experiences in training and learning situations. On the basis of this work, together with a previous study [52] devoted to introducing the so-named modular design for production plants, further research and development works are to be undertaken with the aim of creating VR-based metaverse settings for smart factory modular design.

8. Practical Implications

This research addressed team collaboration and information processing through the use of tools from 3D virtual environments and VR. Based on this first approach to creating small-scale metaverses, developers in this field can continue to design and develop new features to enhance team collaboration using VR. The ways of working with virtual environments accepted until today can change to immersive 3D virtual environments; with this, people are expected to have a greater degree of interaction and communication without the need to be present in the same place, allowing users from anywhere in the world to connect.
Metaverse developers can focus their research on new multimedia capabilities that are unique to 3D virtual environments with VR, whilst also ensuring knowledge sharing and collaboration for education, services, meetings, sales, and purchasing and, in general, the entire productive sector in which more than one person can interact, instead of continuing to seek the simulation of interaction on virtual platforms. The practical potential presented in this document could be applied successfully with any type of user or company, with metaverses based on real experiences, without incurring costs or risks due to a lack of training.

9. Conclusions

In this article, we present our argument for supporting an effective team collaboration framework and case study model in a VR 3D virtual environment by creating a small-scale metaverse. This argument puts forward the premise that 3D virtual environments with VR present unique immersive features for team collaboration compared to traditional communication technologies.
First-person presence in the 3D environment with VR can immerse the user in living the experience in a realistic way, and with this, the participants can achieve a greater degree of information processing in the tasks performed during practice. When the task involves visual or spatial components, 3D realism can lead users to think as if they were in the real world, so tasks can be performed with a greater degree of learning.
Avatar-based interaction facilitates team communication and collaboration. It gives the feeling of being together, which motivates the group to cooperate and perform activities in real time. In particular, Industry 4.0, with its different disruptive technologies, can be digitally represented through virtual worlds. This VR feature enables simulation studies that allow staff to train before doing so in the real world.
In this sense, the creation of metaverses is a low-cost solution for companies and educational institutions. It saves the logistical costs of transportation and the purchase of materials and supplies, and users can perform training practices as many times as necessary before going into real practice.

Author Contributions

Conceptualization, L.O.A. and D.J.Q.-P.; methodology, L.O.A. and J.C.T.; software, L.O.A. and J.C.T.; validation, L.O.A. and D.J.Q.-P.; resources, L.O.A. and D.H.P.-O.; data curation, L.O.A. and D.J.Q.-P.; writing—original draft preparation, L.O.A. and D.H.P.-O.; writing—review and editing, D.H.P.-O. and J.C.T.; supervision, D.H.P.-O. and J.C.T.; funding acquisition, D.J.Q.-P. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by Universidad Cooperativa de Colombia—Cali, Colombia. (INV2788).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

Data are provided on request.

Acknowledgments

The authors thank the SDAS Research Group (https://sdas-group.com/, accessed on 4 May 2022) for its valuable support.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Brookes, J.; Warburton, M.; Alghadier, M.; Mon-Williams, M.; Mushtaq, F. Studying human behavior with virtual reality: The Unity Experiment Framework. Behav. Res. 2020, 52, 455–463. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  2. Park, S.M.; Kim, Y.G. A Metaverse: Taxonomy, Components, Applications, and Open Challenges. IEEE Access 2022, 10, 4209–4251. [Google Scholar] [CrossRef]
  3. Fillatreau, P.; Fourquet, J.Y.; Le Bolloc’H, R.; Cailhol, S.; Datas, A.; Puel, B. Using virtual reality and 3D industrial numerical models for immersive interactive checklists. Comput. Ind. 2013, 64, 1253–1262. [Google Scholar] [CrossRef] [Green Version]
  4. He, Z.; Rosenberg, K.T.; Perlin, K. Exploring configuration of mixed reality spaces for communication. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, Glasgow, UK, 4–9 May 2019; pp. 1–6. [Google Scholar] [CrossRef] [Green Version]
  5. Beier, G.; Ullrich, A.; Niehoff, S.; Reißig, M.; Habich, M. Industry 4.0: How it is defined from a sociotechnical perspective and how much sustainability it includes—A literature review. J. Clean. Prod. 2020, 259, 120856. [Google Scholar] [CrossRef]
  6. Juřík, V.; Herman, L.; Kubíček, P.; Stachoň, Z.; Šašinka, Č. Cognitive aspects of collaboration in 3D virtual environments. In Proceedings of the International Archives of Photogrammetry, Remote Sensing and Spatial Information Sciences, Prague, Czech Republic, 12–19 July 2016; Volume 41, pp. 663–670. [Google Scholar] [CrossRef]
  7. Connect 2021: Nuestra Visión del Metaverso. 2022. Available online: https://about.fb.com/ltam/news/2021/10/connect-2021-nuestra-vision-del-metaverso/ (accessed on 25 January 2022).
  8. Carruth, D.W. Virtual reality for education and workforce training. In Proceedings of the 2017 15th International Conference on Emerging eLearning Technologies and Applications (ICETA), Stary Smokovec, Slovakia, 26–27 October 2017; pp. 1–6. [Google Scholar] [CrossRef]
  9. Santos, K.; Loures, E.; Piechnicki, F.; Canciglieri, O. Opportunities assessment of product development process in Industry 4.0. Procedia Manuf. 2017, 11, 1358–1365. [Google Scholar] [CrossRef]
  10. Landherr, M.; Schneider, U.; Bauernhansl, T. The Application Center Industrie 4.0—Industry-driven manufacturing, research and development. Procedia Cirp 2016, 57, 26–31. [Google Scholar] [CrossRef]
  11. Rupp, M.; Schneckenburger, M.; Merkel, M.; Rainer Börret, R.; Harrison, D. Industry 4.0: A Technological-Oriented Definition Based on Bibliometric Analysis and Literature Review. J. Open Innov. Technol. Mark. Complex. 2021, 7, 68. [Google Scholar] [CrossRef]
  12. Quiroga-Parra, D.J.; Torrent-Sellens, J.; Murcia-Zorrilla, C.P. Las tecnologías de la información en América Latina, su incidencia en la productividad: Un análisis comparado con países desarrollados. Dyna 2017, 84, 281–290. [Google Scholar] [CrossRef]
  13. Xu, L.D.; Xu, E.L.; Li, L. Industry 4.0: State of the art and future trends. Int. J. Prod. Res. 2018, 56, 2941–2962. [Google Scholar] [CrossRef] [Green Version]
  14. Zhong, R.Y.; Xu, X.; Klotz, E.; Newman, S.T. Intelligent Manufacturing in the Context of Industry 4.0: A Review. Engineering 2017, 3, 616–630. [Google Scholar] [CrossRef]
  15. Lu, Y. Industry 4.0: A survey on technologies, applications and open research issues. J. Ind. Inf. Integr. 2017, 6, 1–10. [Google Scholar] [CrossRef]
  16. Zaimovic, T. Setting speed-limit on Industry 4.0—An outlook of power-mix and grid capacity challenge. Procedia Comput. Sci. 2019, 158, 107–115. [Google Scholar] [CrossRef]
  17. Roldán, J.J.; Crespo, E.; Martín-Barrio, A.; Peña-Tapia, E.; Barrientos, A. A training system for Industry 4.0 operators in complex assemblies based on virtual reality and process mining. Robot. Comput. Integr. Manuf. 2019, 59, 305–316. [Google Scholar] [CrossRef]
  18. Ottogalli, K.; Rosquete, D.; Amundarain, A.; Aguinaga, I.; Borro, D. Flexible framework to model Industry 4.0 processes for virtual simulators. Appl. Sci. 2019, 9, 4983. [Google Scholar] [CrossRef] [Green Version]
  19. Liagkou, V.; Salmas, D.; Stylios, C. Realizing virtual reality learning environment for industry 4.0. Procedia CIRP 2019, 79, 712–717. [Google Scholar] [CrossRef]
  20. Büchi, G.; Cugno, M.; Castagnoli, R. Smart factory performance and Industry 4.0. Technol. Forecast. Soc. Chang. 2020, 150, 119790. [Google Scholar] [CrossRef]
  21. Shi, Z.; Xie, Y.; Xue, W.; Chen, Y.; Fu, L.; Xu, X. Smart factory in Industry 4.0. Syst. Res. Behav. Sci. 2020, 37, 607–617. [Google Scholar] [CrossRef]
  22. Jones, M.D.; Hutcheson, S.; Camba, J.D. Past, present, and future barriers to digital transformation in manufacturing: A review. J. Manuf. Syst. 2021, 60, 936–948. [Google Scholar] [CrossRef]
  23. Żywicki, K.; Zawadzki, P.; Górski, F. Virtual reality production training system in the scope of intelligent factory. In Proceedings of the International Conference on Intelligent Systems in Production Engineering and Maintenance, Wroclaw, Poland, 17–18 September 2017; Springer: Cham, Switzerland, 2017; pp. 450–458. [Google Scholar] [CrossRef]
24. Wittenberg, C.; Bauer, B.; Stache, N. A smart factory in a laboratory size for developing and testing innovative human-machine interaction concepts. In Proceedings of the International Conference on Applied Human Factors and Ergonomics, Washington, DC, USA, 25–29 July 2019; Springer: Cham, Switzerland, 2019; pp. 160–166. [Google Scholar] [CrossRef]
  25. Damiani, L.; Demartini, M.; Guizzi, G.; Revetria, R.; Tonelli, F. Augmented and virtual reality applications in industrial systems: A qualitative review towards the industry 4.0 era. IFAC-PapersOnLine 2018, 51, 624–630. [Google Scholar] [CrossRef]
  26. Radhakrishnan, U.; Koumaditis, K.; Chinello, F. A systematic review of immersive virtual reality for industrial skills training. Behav. Inf. Technol. 2021, 40, 1310–1339. [Google Scholar] [CrossRef]
  27. Radianti, J.; Majchrzak, T.A.; Fromm, J.; Wohlgenannt, I. A systematic review of immersive virtual reality applications for higher education: Design elements, lessons learned, and research agenda. Comput. Educ. 2020, 147, 103778. [Google Scholar] [CrossRef]
  28. Jensen, L.; Konradsen, F. A review of the use of virtual reality head-mounted displays in education and training. Educ. Inf. Technol. 2018, 23, 1515–1529. [Google Scholar] [CrossRef] [Green Version]
  29. Ozcinar, C.; Smolic, A. Visual attention in omnidirectional video for virtual reality applications. In Proceedings of the 2018 Tenth International Conference on Quality of Multimedia Experience (QoMEX), Cagliari, Italy, 29 May–1 June 2018; pp. 1–6. [Google Scholar] [CrossRef]
  30. Sutcliffe, A.G.; Poullis, C.; Gregoriades, A.; Katsouri, I.; Tzanavari, A.; Herakleous, K. Reflecting on the design process for virtual reality applications. Int. J. Hum. Comput. Interact. 2019, 35, 168–179. [Google Scholar] [CrossRef]
  31. Scavarelli, A.; Arya, A.; Teather, R.J. Virtual reality and augmented reality in social learning spaces: A literature review. Virtual Real. 2021, 25, 257–277. [Google Scholar] [CrossRef]
  32. Guo, Z.; Zhou, D.; Chen, J.; Geng, J.; Lv, C.; Zeng, S. Using virtual reality to support the product’s maintainability design: Immersive maintainability verification and evaluation system. Comput. Ind. 2018, 101, 41–50. [Google Scholar] [CrossRef]
  33. Lee, L.H.; Braud, T.; Zhou, P.; Wang, L.; Xu, D.; Lin, Z.; Kumar, A.; Bermejo, C.; Hui, P. All one needs to know about metaverse: A complete survey on technological singularity, virtual ecosystem, and research agenda. arXiv 2021, arXiv:2110.05352. [Google Scholar]
  34. Ning, H.; Wang, H.; Lin, Y.; Wang, W.; Dhelim, S.; Farha, F.; Ding, J.; Daneshmand, M. A survey on metaverse: The state-of-the-art, technologies, applications, and challenges. arXiv 2021, arXiv:2111.09673. [Google Scholar]
  35. Seok, W.H. Analysis of Metaverse Business Model and Ecosystem. Electron. Telecommun. Trends 2021, 36, 81–91. [Google Scholar] [CrossRef]
  36. Song, S.W.; Chung, D.H. Explication and Rational Conceptualization of Metaverse. Informatiz. Policy 2021, 28, 3–22. [Google Scholar] [CrossRef]
  37. Duan, H.; Li, J.; Fan, S.; Lin, Z.; Wu, X.; Cai, W. Metaverse for social good: A university campus prototype. In Proceedings of the 29th ACM International Conference on Multimedia, Virtual Event, 20–24 October 2021; pp. 153–161. [Google Scholar] [CrossRef]
  38. Anacona, J.D.; Millán, E.E.; Gómez, C.A. Application of metaverses and the virtual reality in teaching. Entre Cienc. Ing. 2019, 13, 59–67. [Google Scholar] [CrossRef] [Green Version]
  39. Suzuki, S.N.; Kanematsu, H.; Barry, D.M.; Ogawa, N.; Yajima, K.; Nakahira, K.T.; Yoshitake, M. Virtual Experiments in Metaverse and their Applications to Collaborative Projects: The framework and its significance. Procedia Comput. Sci. 2020, 176, 2125–2132. [Google Scholar] [CrossRef]
  40. Chen, C.J.; Toh, S.C.; Fauzy, W.M. The Theoretical Framework for Designing Desktop Virtual Reality-Based Learning Environments. J. Interact. Learn. Res. 2004, 15, 147–167. [Google Scholar]
  41. Steffen, J.H.; Gaskin, J.E.; Meservy, T.O.; Jenkins, J.L.; Wolman, I. Framework of Affordances for Virtual Reality and Augmented Reality. J. Manag. Inf. Syst. 2019, 36, 683–729. [Google Scholar] [CrossRef]
  42. Kim, W.S. Edge Computing Server Deployment Technique for Cloud VR-based Multi-User Metaverse Content. J. Korea Multimed. Soc. 2021, 24, 1090–1100. [Google Scholar] [CrossRef]
  43. He, Z.; Du, R.; Perlin, K. CollaboVR: A Reconfigurable Framework for Creative Collaboration in Virtual Reality. In Proceedings of the 2020 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), Porto de Galinhas, Brazil, 9–13 November 2020; pp. 542–554. [Google Scholar] [CrossRef]
  44. Chen, X.; Wang, M.; Wu, Q. Research and development of virtual reality game based on unreal engine 4. In Proceedings of the 2017 4th International Conference on Systems and Informatics (ICSAI), Hangzhou, China, 11–13 November 2017; pp. 1388–1393. [Google Scholar] [CrossRef]
  45. Unreal Engine: The Most Powerful Real-Time 3D Creation Tool. 2021. Available online: https://www.unrealengine.com/en-US/ (accessed on 25 January 2022).
  46. Urrutia, G.A.M.; López, C.E.N.; Martínez, L.F.F.; Corral, M.A.R. Procesos de desarrollo para videojuegos. Cult. Cient. Tecnol. 2015, 37, 25–39. Available online: https://148.210.21.18/ojs/index.php/culcyt/article/view/299 (accessed on 25 January 2022).
  47. Gómez Sánchez, M. Test de usabilidad en entornos de Realidad Virtual. No Solo Usabilidad 2018, 17. Available online: https://www.nosolousabilidad.com/articulos/test_usabilidad_realidad_virtual.htm (accessed on 25 January 2022).
  48. Schmeil, A. Designing collaboration experiences for 3D virtual worlds. Comput. Sci. 2012, 226, 209. Available online: https://www.semanticscholar.org/paper/Designing-collaboration-experiences-for-3D-virtual-Schmeil/8f0c4910636c8cfc1a6033c1e23467aafa1e301d (accessed on 25 January 2022).
  49. Lee, H.; Woo, D.; Yu, S. Virtual Reality Metaverse System Supplementing Remote Education Methods: Based on Aircraft Maintenance Simulation. Appl. Sci. 2022, 12, 2667. [Google Scholar] [CrossRef]
  50. Rebollo, C.; Gasch, C.; Remolar, I.; Delgado, D. Learning First Aid with a Video Game. Appl. Sci. 2021, 11, 11633. [Google Scholar] [CrossRef]
  51. Kim, H.K.; Park, J.; Choi, Y.; Choe, M. Virtual reality sickness questionnaire (VRSQ): Motion sickness measurement index in a virtual reality environment. Appl. Ergon. 2018, 69, 66–73. [Google Scholar] [CrossRef]
  52. Alpala, L.O.; Alemany, M.D.M.E.; Peluffo-Ordoñez, D.H.; Bolaños, F.; Rosero, A.M.; Torres, J.C. Methodology for the design and simulation of industrial facilities and production systems based on a modular approach in an “Industry 4.0” context. DYNA 2018, 85, 243–252. [Google Scholar] [CrossRef]
Figure 1. The Unreal Engine experiment framework.
Figure 2. VR development methodology.
Figure 3. Usability testing methodology.
Figure 4. Manufacturing plant case study: the digital factory metaverse.
Figure 5. The avatar-based collaboration multi-user framework: (a) shows the process overview of VR for metaverse content; and (b) shows how the avatars represent users in the virtual world.
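The avatar-based collaboration in Figure 5b depends on every connected client receiving a consistent view of each user's avatar. In Unreal Engine this is normally handled by actor and property replication; the following is a minimal sketch of how such an avatar class could be declared, where the class name AMetaverseAvatar and the replicated DisplayName property are illustrative assumptions rather than the project's actual implementation.

```cpp
// MetaverseAvatar.h -- minimal sketch of a network-replicated avatar (hypothetical class name).
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Character.h"
#include "MetaverseAvatar.generated.h"

UCLASS()
class AMetaverseAvatar : public ACharacter
{
	GENERATED_BODY()

public:
	AMetaverseAvatar();

	// Name shown for the avatar; replicated so every client sees the same value.
	UPROPERTY(Replicated, BlueprintReadOnly, Category = "Metaverse")
	FString DisplayName;

	virtual void GetLifetimeReplicatedProps(TArray<FLifetimeProperty>& OutLifetimeProps) const override;
};

// MetaverseAvatar.cpp
#include "MetaverseAvatar.h"
#include "Net/UnrealNetwork.h"

AMetaverseAvatar::AMetaverseAvatar()
{
	// Replicate the actor and its movement so remote players see this avatar move in real time.
	bReplicates = true;
	SetReplicateMovement(true);
}

void AMetaverseAvatar::GetLifetimeReplicatedProps(TArray<FLifetimeProperty>& OutLifetimeProps) const
{
	Super::GetLifetimeReplicatedProps(OutLifetimeProps);
	DOREPLIFETIME(AMetaverseAvatar, DisplayName);
}
```

Movement replication covers the avatar's position and orientation, while any additional state (such as the display name above) must be registered explicitly for replication.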
Figure 6. Implementation of some Industry 4.0 technologies in the smart factory case study: (a) sensors, actuators, and radio frequency identification (RFID) on conveyors; (b) realistic representation of a storage system using a digital twin; (c) autonomous robots working collaboratively on repetitive tasks; (d) real-time big data and data analytics in the process; (e) human–machine interface with interfaces and monitors; (f) maintenance of machinery and equipment. A demo is available at: https://sdas-group.com/gallery/ (accessed on 6 May 2022).
Figure 7. VR testing of functionality and playability: (a) digital factory metaverse app home screen; and (b) users testing the application with VR in the laboratory.
Figure 8. Usability testing: (a) four applicable parameters of the metrics; and (b) VR usage time.
Table 1. Controller styles with functionalities.
Motion controllers: designed to represent the physical controllers in VR experiences. Each controller consists of a motion controller and several motion components.
Laser motion controller: equips the motion controller with a laser (implemented as a motion component) as the primary means of interaction with other players. The laser motion component attaches a laser pointer to the controller, which emits a laser trace and gives the player the possibility of remote interaction.
Hands motion controller: enables the motion controller to appear with a skeletal mesh of a hand; it also serves as the main class for all hand motion controllers.
Grab motion component: allows the motion controller to grab or grasp actors. When the virtual reality hands attempt to grab or grasp an actor, the hand searches the actor's mesh for sockets and matches the most appropriate socket to the motion controller.
Radial menu: the radial menu motion component generates a circular set of buttons that can be selected with the same hand's thumb or with the other hand's laser. The radial menu is a VR-only user interface. (A code sketch of this controller setup follows the table.)
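The controller styles listed in Table 1 build on Unreal Engine's motion controller components. The condensed sketch below (header and constructor merged for brevity) shows how one motion controller per hand could be attached to a VR pawn; the class name AVRPawnSketch is an illustrative assumption, and the setup presumes the engine's HeadMountedDisplay module is enabled, as in the standard VR template.

```cpp
// VRPawnSketch.h (condensed) -- illustrative pawn with one motion controller per hand.
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Pawn.h"
#include "Components/SceneComponent.h"
#include "MotionControllerComponent.h"   // provided by the HeadMountedDisplay module
#include "VRPawnSketch.generated.h"

UCLASS()
class AVRPawnSketch : public APawn
{
	GENERATED_BODY()

public:
	AVRPawnSketch()
	{
		// The tracking origin of the play space.
		USceneComponent* VROrigin = CreateDefaultSubobject<USceneComponent>(TEXT("VROrigin"));
		RootComponent = VROrigin;

		// One UMotionControllerComponent per physical controller; the MotionSource
		// name tells the XR runtime which tracked device drives the component.
		LeftController = CreateDefaultSubobject<UMotionControllerComponent>(TEXT("LeftController"));
		LeftController->SetupAttachment(RootComponent);
		LeftController->MotionSource = FName(TEXT("Left"));

		RightController = CreateDefaultSubobject<UMotionControllerComponent>(TEXT("RightController"));
		RightController->SetupAttachment(RootComponent);
		RightController->MotionSource = FName(TEXT("Right"));

		// Hand meshes, laser pointers, grab components, and the radial menu from
		// Table 1 would be attached to these two controllers in the same way.
	}

	UPROPERTY(VisibleAnywhere) UMotionControllerComponent* LeftController;
	UPROPERTY(VisibleAnywhere) UMotionControllerComponent* RightController;
};
```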
Table 2. VR usability testing.
Planning:
Test goals: performing online multi-user playability tests of a smart factory production plant.
User profile: engineering students, 20 participants in total.
Test setting: laboratory and the users' places of residence.
Equipment: specialized VR equipment as detailed in Section 5.2.2.
Documentation: usability document.
Metrics: usability metrics (defined in Table 4).
Test development:
Resource preparation: participants received prior accompaniment and training and were divided into 3 groups that took part in separate sessions of approximately 2 h each. A specialized VR laboratory with a complete equipment kit (6 sets in total) was available for testing; users staying at home used their personal computers.
User reception: users were trained as a group on the use of the controls, the multi-user mode, and the immersive VR experience prior to the test.
Testing: each user had a VR device connected to the Internet, and the application was run through a session created by the main tutor. Each user within the groups experienced real-time immersion in the virtual world to work on team tasks.
Test results: user experience analysis, functionality and playability, improvement strategy, metrics evaluation, and conclusions.
Table 3. Applications of Industry 4.0 technologies in the case study project.
Radio frequency identification (RFID): electronic sensors and actuators, with RFID integrated into the machines.
Cyber-physical systems (CPS): smart factory connected in real time with all processes.
Big data and data analytics (BD): management and analysis of big production data.
Cloud computing: external data storage, in the so-called cloud, with fast response capability.
Human–machine interface (HMI): interfaces and monitors in the process.
Manufacturing execution system (MES): control of the main processes through a real-time connection.
Computer-aided maintenance management (CMMS): maintenance of machinery and equipment.
Collaborative platforms: integration of different platforms in real time.
Augmented reality, virtual reality, and simulation: use of VR and simulation for training and education with users.
Artificial intelligence (AI): people, machines, processes, and transport systems controlled with AI.
Digital twin: representation of the real factory in the virtual factory, covering installations, machines, and products (see the sketch after this table).
Collaborative robotics: integration of different types of robots for repetitive tasks.
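Several of the rows above (RFID, CPS, and the digital twin) hinge on streaming plant data into the virtual factory. The sketch below illustrates one possible shape of that coupling; the FSensorReading fields, the tag-based actor lookup, and the ApplyReadingToTwin function are hypothetical and are not taken from the project described in the paper.

```cpp
// Hypothetical sketch: applying a real-world sensor/RFID reading to its digital-twin actor.
#include "CoreMinimal.h"
#include "GameFramework/Actor.h"

// A plain record as it might arrive from a plant gateway (fields are illustrative).
struct FSensorReading
{
	FString StationId;    // identifier of the physical station (e.g., a conveyor segment)
	float ConveyorSpeed;  // current belt speed in m/s
	bool bItemPresent;    // whether the RFID reader detects a tagged item
};

// Match the reading to a twin actor via an actor tag and react to it.
void ApplyReadingToTwin(AActor* TwinActor, const FSensorReading& Reading)
{
	if (TwinActor == nullptr)
	{
		return;
	}

	// Each twin actor carries the tag of the physical station it mirrors.
	if (TwinActor->Tags.Contains(FName(*Reading.StationId)))
	{
		// In a full implementation this would drive conveyor animation speed,
		// material states, or dashboard widgets; the sketch only logs the update.
		UE_LOG(LogTemp, Log, TEXT("Twin %s updated: speed=%.2f m/s, item=%s"),
			*Reading.StationId, Reading.ConveyorSpeed,
			Reading.bItemPresent ? TEXT("yes") : TEXT("no"));
	}
}
```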
Table 4. Metrics definition using the VR application.
User experience:
Learning: assess how quickly the user learns to operate the device and how this process can be improved.
Help: evaluate when the user requires help in handling the device and the most effective way to provide it without influencing their actions.
Application functionality: in terms of interactivity, walkthroughs, multimedia, and object simulations.
3D virtual worlds with VR vs. other technologies: the user's experience with different 3D simulation technologies.
Visualization of 2D and 3D graphics in the application: the quality of the graphics as perceived by the user will be assessed.
Navigation and interaction: since VR is still new to most users, the user's experience with the application will be evaluated.
Content developed for the smart factory: the user's immersive experience of the virtual world scenario will be assessed against reality.
Preparation and control of devices:
Test location: laboratory or the user's home.
Critical errors: related to planning, VR equipment, space, and staff logistics.
Handling of controls: the learning and adequate use of the controls for the different functions of the application will be assessed.
Use of VR equipment: the comfort, handling, and frame rate (FPS) of all the VR equipment will be assessed.
Multi-user:
Communication: the most effective form of communication between users, inside and outside the application, will be assessed.
Team collaboration: effective collaboration to perform tasks as a team will be assessed.
Multi-user connectivity: stable Internet connectivity for multimedia playback and the online multi-user connection.
Avatars: the user's perception of the avatar within the virtual world.
Immersion effects:
General discomfort: the participant experiences general body discomfort during and after the test.
Vertigo: a sensation perceived by the participant due to slight movements and heights inside the VR application.
Sweating: the participant perspires on the forehead, hands, and other parts of the body during the test.
Nausea: the sensation of feeling like vomiting during or after the test.
Fatigue: fatigue in parts of the body due to standing for a period of time.
Stomach awareness: a sensation of dizziness or discomfort in the stomach during or after the test.
Difficulty focusing: the participant has difficulty concentrating on the test.
Blurred vision: the participant's vision is too blurred to see the graphics on the VR device lenses adequately.
Table 5. Main metrics evaluation results.
User experience: The first metric evaluated was the user experience with VR. The analysis shows that participants with previous experience of VR and video games reported higher satisfaction with the gameplay than the other groups (see Figure 8a). The evaluation included 20 participants. The most interesting finding is that 88% of the male participants gave a positive response in terms of user experience and playability, compared with 70% of the female participants. On average, men spent 2.8 h using VR, while women averaged 2.4 h. In terms of meeting expectations, all participants would use VR again as a training and coaching practice.
Preparation and control of devices: For the second metric, preparation and control of the devices and start-up of the application, users with previous experience needed less preparation and learning time; for the first group, the preparation was completely new, and their adaptation required several sessions with the VR application (see Figure 8a). The evaluation included 20 participants. The most interesting finding is that 95% of men and 89% of women adapted quickly to the preparation and control of the VR devices. For participants using the VR devices for the first time, learning took longer; however, by their second session with the VR devices everything went faster and without complications.
Multi-user: For the third metric, multi-user, the group experience, communication, collaboration, connectivity, and avatars were evaluated; all students rated this metric favorably according to their experience (see Figure 8a). The most interesting finding is that 93% of males and 90% of females had a good experience using the multi-user mode, both in terms of communication and of group collaboration. The multi-user mode proved very practical for group practice in the metaverse, as all participants stated that during the test session they could feel a realism reminiscent of the real world.
Immersion effects: The metrics defined for the immersion effects were based on selected items of the Virtual Reality Sickness Questionnaire (VRSQ) presented in [51]. Since our study focuses on the proposal of a VR system framework for an experimental metaverse, the items closest to the general evaluation of that questionnaire were considered: general discomfort, vertigo, sweating, nausea, fatigue, stomach awareness, difficulty focusing, and blurred vision. The fourth metric, the effects caused by immersion in VR, evaluated the side effects that VR may cause after use. The evaluation involved 20 participants. The analysis showed that the participants of the third group mostly did not present dizziness or serious side effects; in contrast, the first group, which had no experience with VR equipment and video games, presented slight sensations of dizziness and eye fatigue, which is also related to the time of use of the VR goggles (Figure 8a,b). Some side effects caused by VR devices may disappear with frequent use of VR, as reported by the more experienced participants who had been using these devices for a long time.
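As noted in the immersion-effects results, the items were adapted from the VRSQ [51]. The snippet below shows only a simplified way of collapsing the eight items of Table 4 into a single 0–100 severity value per participant; it is not the official VRSQ scoring procedure, and the example ratings are invented.

```cpp
// Simplified aggregation of the eight immersion-effect items from Table 4.
// NOTE: an illustrative 0-100 average, not the official VRSQ scoring from [51].
#include <array>
#include <cstdio>
#include <numeric>

int main()
{
	// Ratings from 0 (none) to 3 (severe), in the order:
	// general discomfort, vertigo, sweating, nausea,
	// fatigue, stomach awareness, difficulty focusing, blurred vision.
	const std::array<int, 8> ratings = {1, 1, 0, 0, 1, 0, 0, 1};  // hypothetical participant

	const int total = std::accumulate(ratings.begin(), ratings.end(), 0);
	const int maxTotal = 3 * static_cast<int>(ratings.size());

	// Scale to 0-100 so sessions and participant groups can be compared.
	const double severity = 100.0 * total / maxTotal;
	std::printf("Immersion side-effect severity: %.1f / 100\n", severity);
	return 0;
}
```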
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
