1. Introduction
Virtual Reality (VR) is an innovative technology developed in the 20th century, and its consumer base continues to grow. In 2018, CCS Insight reported that the value of VR devices sold would increase from US$1.5 billion in 2017 to US$9.1 billion by 2021 [1,2]. VR technologies provide innovative Information and Communication Technologies (ICTs) that improve the user experience of interacting with the environment [1,2], and they generate immersive customer experiences by combining virtual and physical touchpoints [2]. VR has three primary characteristics: 360-degree, real-time and interactive [3].
360-degree: Speicher concludes that the 360-degree experience is the key feature of VR [2,4]. "360-degree" does not merely refer to the number of degrees; it means that the VR device can provide an immersive virtual environment (VE) in which users are largely immersed both visually and in motion [2].
Real-time: Every behavior of the 3D entities runs in real time [4], so users feel as though they are in a ‘realistic’ environment in a pseudo-natural manner [2,3].
Interactive: The system provides functionalities that let users act within the VE, such as moving and manipulating objects [2,3].
Owing to these characteristics, VR technology opens a new horizon in the traditional game industry and shows enormous potential in the educational, commercial and medical fields [2]. VR can use its immersive characteristics to improve communication efficiency [3]. Moreover, VR technologies lead users to interact with the virtual environment and become active participants [4,5], and the interactive environment can train users' reflexes. VR and its derivative technologies have the following advantages when applied in the educational field. First, users can use VR equipment to self-learn teaching resources anytime and anywhere [3,6], so educators no longer need to spend a long time explaining detailed knowledge in the traditional education mode [7,8]. Second, VR technology enables users to obtain a more realistic learning experience: in some courses, the teaching content is displayed in a 3D layout, and users wear VR devices to view this content from different perspectives in a virtualized environment [1,9], making learning immersive. Third, VR and its derivative technologies can stimulate students' interest in learning. Using Augmented Reality (AR) and VR technology, users' senses such as vision, touch and hearing can be integrated to produce a real situational, immersive experience. The educational content becomes perceptible, interactive and even touchable, effectively dispelling the boring learning atmosphere of the past [8,9].
Considering the VR advantages in the educational field, Kirkley built a conceptual framework for VR education in 2005 and illustrated possible future views of the VR industry [5]. She put forward two key elements in VR education: learning environments and training technologies [5]. Learning environments focus on how to build the VE [5], while training technologies pay more attention to the interaction between the virtual and the real [5]. Based on Kirkley's framework, Garrett developed a virtual training system for the mining industry in 2012 [6]. In addition, Santoso designed an AR application to assist students' spatial cognitive ability in 2012 [7]. Our research shows that more and more organizations use VR technology to improve their training and education systems [2].
Since 2020, the COVID-19 outbreak has had severe adverse effects globally [10,11,12]. Worldometer shows that it had spread to 223 countries and regions, with 215,477,242 cases and 4,488,570 deaths, as of 27 August 2021 [13]. Many medical scientists have begun to analyze why global society has performed poorly in stopping the COVID-19 outbreak. In Ethiopia, Negera analyzed the Ethiopian population structure through an extensive group investigation [10]. The results show that Ethiopians in the countryside do not have sufficient knowledge and oppose quarantine policy because of their poor economic circumstances [10]. At the same time, the urban population has independent predictors of knowledge level; they support government policy, so urban performance is much better than that of the countryside [10]. Negera highly recommends that the government fill the knowledge gap and persuade people to follow the preventive measures [10]. The COVID-19 knowledge gap does not appear only in developing countries; developed countries have similar circumstances. Geldsetzer conducted an investigation in 2020 to assess public knowledge and perceptions of COVID-19 in the United States (USA) and the United Kingdom (UK) [14]. The report shows that the public there has a better educational background than in developing countries and generally knows the primary mode of disease transmission and the common symptoms [14]. However, many people still hold misconceptions about COVID-19; for instance, only 40% of USA and 32% of UK respondents believed that a face mask can protect the wearer from getting infected with COVID-19 [14]. Geldsetzer argues that governments need to correct public misconceptions and provide relevant knowledge to the public [14]. Considering the above data, it is evident that government agencies and public organizations should provide adequate COVID-19 knowledge and information to the public to help society face the outbreak. Therefore, how to improve emergency preparedness and response capabilities has become a concern of the global public health community.
Traditional epidemic education for the public mainly relies on national science and health documents. For instance, the standard healthy courses of the USA CDC include reading (book/online resources/poster) and the Poisoned Picnic [15]. Reading clearly plays the dominant role in these courses; the only interactive session is the Poisoned Picnic, which lets students join the practice as active participants. However, that session cannot run without teachers and enough participants, so traditional education faces significant time and participant limitations. It is also challenging to attract young people and the countryside population: the youth require interactive elements, while the countryside population faces location limitations. Medical education simulation software developed with VR technology is a new educational methodology [12]. VR technology can provide more entertainment and interactive elements to attract these groups, and a VR application has no location limitation: it can run anytime and anywhere.
In this paper, the research aims to design a VR education system that government agencies and education organizations can apply to further public understanding of the COVID-19 epidemic. The proposed tool is based on entertaining interaction and has no time or location limitation (Figure 1). Its characteristics are as follows:
The VR education system uses Virtual Storytelling Technology (VST) to build the system structure and provide users with active exploratory experiences.
The system balances entertainment and education elements. The research covers fundamental COVID-19 knowledge drawn from various medical articles, while entertainment interactions are integrated into the exploring process to enhance the user's immersive experience.
The system uses Unreal Engine 4 as the development platform. By writing Blueprint scripts, it can efficiently implement VR/desktop interactions that meet the criteria of the various platforms.
Our research has already released some preliminary outcomes. At the 4th International Conference on Control and Computer Vision (ICCCV 2021), we presented how to design the 3D environment and UI for the VR education system [9]. Compared with that article, the present research moves much further: it builds up the system structure with VST and organizes a user test experiment. The following sections discuss the research methodologies, the system overview and the experiments.
3. System Overview
By analyzing public needs and the main framework of the VST, the logical architecture of the proposed system is shown in Figure 3. The structure follows the storyline design. First, there are five major sections: menu, tutorial, personal preparation, onsite investigation and case submission. Second, different sections use different scenes: the menu and case submission sections use a traditional 2D panel, while the tutorial, personal preparation and onsite investigation sections run in the VE. Third, there are two major scenes: the medical center and the case environment. The medical center scene simulates a standard medical center with a meeting room, equipment room, front desk and ambulance section.
The system is a desktop-based VR application that conveys COVID-19 knowledge to users as they explore the VE. Combined with VR interactive technology, it can provide users with an immersive experience. Unreal Engine 4 has strong compatibility, supporting various VR and desktop video games, so the VR educational system is developed with Unreal Engine 4. The following sub-sections cover the system details, including the interactive mode and the function modules.
3.1. Interactive Mode
The interactive mode of this system can be divided into the operation mode and user interface.
3.1.1. Operation Mode
The system's primary interactive methodology is based on VR technologies. Across the world, although XR devices achieve high sales, they still occupy a low percentage compared with desktops and mobile phones [1,9,23]. Steam reports that XR devices accounted for only 2.13% of global Steam users in 2021 [9,23]. If the system supported only the VR platform, it could not achieve the primary target of popularizing virtual simulation education. Therefore, the system also provides a traditional keyboard/mouse operation mode to extend the user range.
Table 1 shows the operation input instructions for the computer and VR. The system uses Oculus Touch as the sample VR controller instruction set. The desktop controlling mode can achieve the same interactive results as the VR controllers through the keyboard/mouse.
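As an illustration of how the two operation modes can share one interaction logic, the following Python sketch (not the system's actual Blueprint code; all binding names are hypothetical) maps both keyboard/mouse and Oculus Touch inputs to the same abstract actions:

```python
# Illustrative sketch: device-specific raw inputs are translated into the
# same abstract actions, so the rest of the system is input-device agnostic.
# All binding names here are hypothetical, not taken from the actual system.
DESKTOP_BINDINGS = {
    "mouse_left": "pick_up",      # pick up the selected medical tool
    "key_e": "activate_tool",     # turn on the selected tool
    "key_w": "move_forward",
}

VR_BINDINGS = {
    "right_trigger": "pick_up",   # Oculus Touch trigger
    "button_one": "activate_tool",
    "left_stick_up": "move_forward",
}

def resolve_action(device, raw_input):
    """Translate a device-specific input event into an abstract action."""
    bindings = VR_BINDINGS if device == "vr" else DESKTOP_BINDINGS
    return bindings.get(raw_input)
```

With such a mapping layer, the gameplay logic reacts only to abstract actions, so the VR and desktop modes behave identically, as Table 1 intends.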
The system provides various interaction methodologies. Figure 4 illustrates the primary interactive samples. Figure 4a displays how the user interacts with the medical package [9]: the system first allows the user to pick up a medical tool with the VR controller's trigger, and the user can then press Button One on the controller to turn on the selected tool [9]. Figure 4b shows how to use a thermometer to measure a close contact's temperature. When the right-hand controller aims the thermometer at the head of the close contact and the user clicks the right-hand trigger button, the user successfully measures the body temperature and starts the storyline conversation with the contact.
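The aiming check behind the thermometer interaction can be sketched as an angle test between the thermometer's forward direction and the direction to the contact's head. The Python below is an illustrative sketch (the actual system implements this in Blueprint), and the 10-degree tolerance is an assumed value:

```python
import math

def aim_at_target(origin, forward, target, max_angle_deg=10.0):
    """True when the forward vector points at the target within tolerance."""
    to_target = [t - o for t, o in zip(target, origin)]
    dist = math.sqrt(sum(c * c for c in to_target))
    if dist == 0:
        return True
    fnorm = math.sqrt(sum(c * c for c in forward))
    cos_angle = sum(f * t for f, t in zip(forward, to_target)) / (fnorm * dist)
    angle = math.degrees(math.acos(max(-1.0, min(1.0, cos_angle))))
    return angle <= max_angle_deg

def measure_temperature(origin, forward, head_position, true_temp):
    """Return a reading only when the thermometer is aimed correctly."""
    if aim_at_target(origin, forward, head_position):
        return true_temp
    return None  # no reading: the user must re-aim
```

A reading only appears when the aim test passes, mirroring the "aim at the head, then click the trigger" behavior described above.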
3.1.2. User Interface
User Interface (UI) is the medium between user control and the system [1]. The UI experience is one standard by which a good user experience is judged [24]. Considering the characteristics of VR device input, it is necessary to provide a clear and straightforward UI to guide the user in exploring the system [1]. The UI section covers three primary aspects: the menu page, the head-up display and environmental indication.
Figure 5 displays the menu page UI: Figure 5a illustrates the loading page and Figure 5b the paused menu page. The research develops the shape, size and color of the UI controls with a flat design. The UI style is clear and easily guides the user into the exploring process.
Figure 6 displays the in-game UI. Figure 6a shows the user measuring the close contact's temperature while wearing the complete protective equipment; the mask and eye-protection icons are displayed at the bottom right. When the user takes off the equipment, the icons disappear to warn the user. Figure 6b displays the tool icons: when the user takes the medical package, the medical tool icons appear on the left side so that the user can check the tool list and select the correct tool to use or wear. The icons are listed with a slight rotation to create depth perspective. Besides the UI icon indication, the system uses 3D objects in the scene to indicate the target position the user should move to (Figure 6b). Both the 2D and 3D UI help the user avoid getting lost in the VE and guide them to complete the tasks.
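The protective-equipment HUD described above amounts to deriving the icon list and warning state from the set of currently worn items. A minimal Python sketch, with illustrative item names (not the system's actual implementation):

```python
# Sketch of the HUD icon logic: icons for currently worn protection are
# shown at the bottom right, and removing a required piece raises a warning.
# Item names are illustrative assumptions.
REQUIRED_PROTECTION = {"mask", "eye_protection"}

def hud_state(worn_items):
    """Derive the HUD icons and warning flag from the worn-item set."""
    worn = set(worn_items)
    return {
        "icons": sorted(worn & REQUIRED_PROTECTION),  # bottom-right icons
        "warning": not REQUIRED_PROTECTION <= worn,   # warn when any is missing
    }
```

Recomputing this state whenever the user equips or removes an item reproduces the appear/disappear behavior of the icons in Figure 6a.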
3.2. Function Modules
This section focuses on the main modules of the system, whose scenes are set in the VE. The main modules are the essential sections that use VR technology to convey COVID-19 knowledge to the user. They can be divided into the Tutorial Module, the Preparation Module and the Investigation Module; their details are shown in Figure 7.
The Tutorial Module aims to help the user understand the basic interactions and uses the storyline to engage the user as an active participant. The Preparation Module educates the user about individual protection and related medical protection knowledge. The Investigation Module walks through the investigation process: the user learns how medical staff deal with the disease, which helps the user cooperate with medical staff in daily life. Furthermore, the module teaches environmental cleaning knowledge.
3.2.1. Tutorial Module
In order to assist the user in understanding movement and interaction in the system, it is necessary to build the tutorial module as the first VR module. The module has three sections: Controlling Tutorial, Pick up Phone Call and New Case Response. The controlling tutorial takes place at the front desk of the medical center. The research uses a plot-based methodology to guide the user to the next section: notified to pick up the front-desk phone call, the user moves to the new case response section. This provides the story background that helps the user enter the storyline as an active participant.
Figure 8 displays the exploring flow of the tutorial module. Figure 9a shows the rendered image of the movement tutorial. The system displays text notifications to guide the user; when the user completes the current movement or interaction instruction, the system moves on to the next notification until the tutorial is completed. The front-desk phone rings after the user finishes the controlling tutorial (Figure 9b). The phone section leads the user to trigger the plot; this trigger point enhances user participation and lets the storyline move forward smoothly. Figure 9c displays the new case response section in the meeting room. This section is an animation that introduces the new case information. After the animation plays, the user moves on to the preparation module.
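The notification-driven tutorial flow above can be sketched as a simple step sequencer: completing the current instruction advances to the next notification until the tutorial ends. A minimal Python sketch (step names follow the module description, but the API itself is hypothetical; the system implements this in Blueprint):

```python
# Sketch of the tutorial step sequencer: each completed instruction advances
# the on-screen notification until the tutorial is finished.
class Tutorial:
    STEPS = ["move", "interact", "pick_up_phone_call", "new_case_response"]

    def __init__(self):
        self.index = 0

    @property
    def current_notification(self):
        """The instruction currently shown to the user."""
        if self.index < len(self.STEPS):
            return self.STEPS[self.index]
        return "tutorial_complete"

    def complete_current(self):
        """Called when the user finishes the current instruction."""
        if self.index < len(self.STEPS):
            self.index += 1
        return self.current_notification
```

Gating each advance on the user's own action is what keeps the user an active participant rather than a passive viewer.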
3.2.2. Preparation Module
The preparation module is the second VR module in the system. The module aims to educate the user about medical equipment and individual protection knowledge.
Figure 10 displays the exploring flow of the preparation module. After the tutorial module, the user needs to move to the equipment room. First, the user collects the equipment in the room. Second, the user puts on the personal protection, including goggles, mask, face shield, gloves and gown; the system provides the 3D UI panel shown in Figure 11b. The user cannot leave the room without completing the wearing section. After the wearing section, the user receives a notification and moves to the back door, where the departure animation transfers the user to the investigation module.
Compared with the tutorial module, the preparation module has more interactive elements. Figure 11a shows how to collect the related materials in the equipment room; the research uses highlight colors to simplify the collecting process. When the user enters the equipment room, the system prompts the user to interact with the medical package and to collect the suitable materials from the warehouse into the package, highlighting the tools so they are easy to find. The 3D wearing section is modeled on game character dressing. After the collection is completed, the system indicates the start of the individual protection section. Figure 11b shows the 3D wearing panel: the user needs to move each piece of equipment to its correct corresponding position. After the section is completed, the equipment room door opens and the user can move to the back door to trigger the storyline. Both sections use interactive game methods to improve the user experience.
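The wearing panel and its door gate can be sketched as a slot-matching check: each piece of equipment is accepted only on its correct body slot, and the door stays locked until every required piece is worn. A minimal Python sketch (slot names are illustrative assumptions; the system implements this in Blueprint):

```python
# Sketch of the 3D wearing panel: equipment must be dropped on its matching
# body slot, and the equipment-room door opens only when everything is worn.
# Slot names are illustrative assumptions.
REQUIRED_SLOTS = {
    "goggles": "eyes",
    "mask": "face",
    "face_shield": "head",
    "gloves": "hands",
    "gown": "body",
}

class WearingPanel:
    def __init__(self):
        self.worn = {}

    def place(self, item, slot):
        """Accept the item only when dropped on its correct slot."""
        if REQUIRED_SLOTS.get(item) == slot:
            self.worn[item] = slot
            return True
        return False

    @property
    def door_open(self):
        """The door unlocks once all required equipment is worn."""
        return set(self.worn) == set(REQUIRED_SLOTS)
```

Rejecting a wrong placement gives immediate corrective feedback, which is how the game mechanic doubles as protection-order education.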
3.2.3. Investigation Module
The investigation module is the third VR module in the system. It aims to educate the user about close contact investigation and environmental cleaning knowledge. Figure 12 displays the exploring flow of the investigation module. The user moves to the close contact's room after the ambulance departure animation; the contact is in the bedroom. The first step is communicating with the contact and measuring his temperature. After this investigation, the contact leaves the room for a health facility, and the user continues investigating the room. The system directs the user to collect samples in the bedroom and bathroom. After the collection, the user starts the environmental cleaning and can use the disinfection spray to clean the room under instruction. When the investigation is completed, an animation takes the user to the case submission module.
The module has three major interactive sections: contact investigation, sample collection and environment cleaning. Referring to VST, the research designs simple and attractive interactive methods to improve the user experience in these sections. The module strips out many unnecessary procedural details and retains the core, fundamental knowledge for the user.
Figure 13a shows the communication with the close contact. The user can select different questions to ask the contact; the branching game dialogue enhances the user's participation. Measuring the temperature is a small interactive game: the user needs to aim the thermometer at the contact's forehead (Figure 4b). The thermometer cannot display the correct temperature unless the user aims at the correct target. The section teaches the user how to use a thermometer correctly.
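The branching dialogue can be sketched as a small graph of nodes, each holding the contact's line and the selectable questions. The Python below is illustrative only; the dialogue content is invented for the example, not taken from the system's actual script:

```python
# Sketch of the branching dialogue with the close contact. Each node holds
# the contact's line plus the questions the user may select. All dialogue
# text here is an illustrative assumption.
DIALOGUE = {
    "start": {
        "line": "Hello, I was told to wait here.",
        "choices": {
            "Do you have any symptoms?": "symptoms",
            "Where have you been recently?": "travel",
        },
    },
    "symptoms": {"line": "I have a slight cough.", "choices": {}},
    "travel": {"line": "I visited the market last week.", "choices": {}},
}

def choose(node_id, question):
    """Follow the branch for the selected question; stay put if invalid."""
    node = DIALOGUE[node_id]
    return node["choices"].get(question, node_id)
```

Letting the user pick the question, rather than playing a fixed script, is what turns the investigation interview into an active-participation exercise.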
After the contact leaves the room, the user starts collecting environmental samples in the bathroom and bedroom (Figure 13b). The system highlights the vital furniture; when the user approaches the furniture and uses the tool to collect a sample, the system displays a correct-collection notification. The cleaning section is the next step. The system notifies the user to equip the disinfection spray and clean the environment. The research represents the unclean spots as black spheres, as shown in Figure 13c; the user should spray disinfectant at a target for a while until the sphere disappears. The section uses "capture and destroy" as the general game flow, which motivates the user to finish the cleaning targets and teaches how to use a disinfection spray and clean the environment.
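The "capture and destroy" cleaning loop can be sketched as a per-frame accumulator: the black sphere disappears once the spray has been held on it long enough. A minimal Python sketch (the 2-second threshold is an assumed value; the system implements this in Blueprint):

```python
# Sketch of the cleaning mechanic: an unclean spot (black sphere) accumulates
# spray time each frame the user keeps the disinfection spray on it, and is
# destroyed once the required duration is reached.
class UncleanSphere:
    CLEAN_TIME = 2.0  # seconds of spraying required (assumed value)

    def __init__(self):
        self.sprayed = 0.0
        self.destroyed = False

    def tick(self, dt, spraying_at_sphere):
        """Advance one frame; count spray time only while on target."""
        if self.destroyed:
            return True
        if spraying_at_sphere:
            self.sprayed += dt
            if self.sprayed >= self.CLEAN_TIME:
                self.destroyed = True
        return self.destroyed
```

Requiring sustained, on-target spraying rather than a single click is what teaches the user that disinfection takes deliberate, thorough coverage.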