
A Study on Immersion and VR Sickness in Walking Interaction for Immersive Virtual Reality Applications

Department of Software, Catholic University of Pusan, Busan 46252, Korea
* Author to whom correspondence should be addressed.
Academic Editor: Angel Garrido
Symmetry 2017, 9(5), 78; https://doi.org/10.3390/sym9050078
Received: 2 March 2017 / Revised: 16 May 2017 / Accepted: 17 May 2017 / Published: 22 May 2017

Abstract

This study experimentally analyzes walking interaction to enhance immersion and minimize the virtual reality (VR) sickness of users. In this study, the walking interaction is composed of three steps using input devices with a simple structure that anyone can easily use. The first step consists of a gamepad control method, which is the most popular but has low presence. The second step consists of a hand-based walking control interface, which is mainly used for interaction in VR applications. The last step consists of a march-in-place detection simulator that interacts with the legs, the key body parts for walking. Four experiments were conducted to determine the degree of direct expression of intention by users in the course of walking interactions that can improve immersion and presence and prevent VR sickness in VR applications. Survey experiments were conducted for general users using the Wilcoxon test, a presence questionnaire, and a simulator sickness questionnaire (SSQ). In addition, the technical performance of the VR scenes used in the experiment was analyzed. The experimental results showed that higher immersion was achieved when interactions that felt closer to real walking were provided in VR. Furthermore, it was found that even control methods with a simple structure could be used for walking interactions with minimal VR sickness. Finally, a satisfactory presence was found in VR if the user was able to interact using his or her own legs.
Keywords: walking interaction; immersive virtual reality applications; immersion; VR sickness; gamepad; hand interface; walking simulator

1. Introduction

Virtual reality (VR) enables interactions that satisfy the five senses of users—particularly the visual, auditory, and tactile senses—in order to provide an experience similar to reality. Through these senses, immersive VR systems allow users to experience where they are, whom they are with, and what they are doing as if it were a real experience. In this context, the concept of presence refers to a phenomenon where users act and feel as if they are “really there” in a virtual world created by computer displays [1,2,3]. If a VR environment is not used properly, it may cause ailments such as headaches, dizziness, and nausea. This is called VR sickness [4]. First, in order to improve presence in virtual reality, a user’s visual sense should be satisfied by sending three-dimensional visual information to the user. Devices such as head mounted displays (HMDs) and 3D monitors are used for this purpose. Furthermore, in order to maximize user immersion, realistic interactions with VR objects and backgrounds should be provided by detecting user behavior and motion. These motions can be classified into hand-based motions such as gestures using the fingers, holding, pushing, and throwing objects, or leg-based motions such as walking on the ground. Devices such as Leap Motion accurately capture the motions and gestures of the hands and fingers and express them realistically in a virtual environment. Furthermore, treadmill-type VR devices detect walking and running motions to provide a spacious virtual environment where users can move freely despite a limited real space. Motion capture devices can detect the movements of every joint in the body through markers and cameras, and express these real human motions in a virtual environment, thereby enhancing immersion and presence.
Sutherland [5] researched the first HMD system that provided 3D VR scenes for the visual sense. Since then, studies have been conducted to implement highly immersive VR by satisfying the various senses of users, including the auditory and tactile senses [6,7,8]. A wide variety of research on user-oriented interactions in VR has been conducted to this point, such as a study on capturing the elaborate motions of hands and fingers using optical markers and surface markers, and reflecting these motions in a virtual space [9,10], a study on the improvement of immersion by coupling the sense of touching the ground with a sound [11], and a study on the realistic expression of various behaviors such as going up stairs and pushing against a wall in a virtual space [12]. Furthermore, thoughts and behaviors can change based on the environment or conditions that the user encounters in VR [13]; the presence that users experience in VR through various approaches, such as perspective and muscle activity [14,15], has also been researched. Recently, the effects of VR on users have been analyzed with various approaches, including psychology, neuroscience, and sociology [3].
As the scope of expression in VR becomes larger, the size of the virtual space also expands. However, it is difficult to provide movements similar to reality in a large virtual space because the user’s real space for experiencing VR is limited. Even though VR devices such as motion capture and tracking sensors can accurately detect walking motions and locations, and express them in a virtual space, they are limited to the size of the real room the user is in. In order to address this problem, treadmill-type VR devices have recently received significant attention. However, they have limited popularity due to their high prices and large sizes.
In this study, immersion, presence, and VR sickness during walking interaction are analyzed using a simulator with a simple structure that users operate with their legs, together with other input devices commonly used for walking in VR applications. The feasibility of providing a walking interaction that can satisfy users with an easy and simple control method is verified. The experimental environment for walking interaction in this study is configured in three steps using the same VR scene as follows:
  • Design interactions using a gamepad, which is the most popular input device in VR applications, including games.
  • Design hand interfaces based on Leap Motion to control motions using only hand gestures for direct expression of intention, which is more direct than a gamepad or keyboard.
  • Design walking interaction using an Arduino-based portable simulator [16], which can detect a march-in-place gesture to express walking motions in VR applications.
Factors that users feel, such as immersion, presence, and VR sickness when walking, are measured in the virtual environment based on the three-step walking interaction. For this purpose, we first analyze the technical performance of the VR scenes used in the experiment. Immersion, presence, and VR sickness are analyzed through statistical data from surveys, including a presence questionnaire and a simulator sickness questionnaire (SSQ). Finally, based on the results of this analysis, we attempt to determine what simple interaction structure should be used to provide users with a satisfying walking experience in immersive VR.

2. Previous Works

Various studies have been conducted on the implementation of highly immersive VR based on the five senses of users. Sutherland [5] proposed a system for transmitting visual information realistically in a 3D virtual environment to provide a highly immersive visual experience. Since then, various approaches have been researched for improving the immersion, presence, and realism of virtual reality, including spatial audio, joint-based motion platforms, and haptic systems [6,17,18,19].
In order to provide satisfactory presence in VR (narrowing the gap between the virtual world and real world), the hand motions of users must be expressed realistically [20,21]. Haptic systems and motion platforms not only detect the motions of human joints accurately and express them realistically in a virtual environment, but also provide feedback for physical reactions such as external forces and tactile senses. First, in a study on haptic systems, Hayward et al. [22] researched a system that interacts with a virtual environment through the hands and provides physical feedback for the tactile senses through electrical actuators. Furthermore, a hand haptic system was proposed that allows users to directly touch objects in a virtual space by freely moving their fingers [23]. Additionally, Carvalheiro et al. [18] proposed an interaction system that enables controlling virtual objects as a realistic experience by mapping real hands to virtual hands. Experiments verified that users felt the shapes of remote objects and had experiences similar to the real world through the proposed haptic system. In this manner, haptic systems have been developed around the hands, which are used frequently in interactions in a virtual environment. Such systems require prior research on the accurate motion capture of the hands and fingers at the joint level before interaction in virtual reality can begin. Metcalf et al. [24] proposed a system for capturing the motions of the hands and fingers based on surface markers. Zhao et al. [9] researched a hand motion capture system using a Kinect camera with optical markers. The accurate capture and simulation of hand motions and gestures were verified through experiments. Hand motion capture is still being actively researched [25]. However, actual uses of the outputs of such studies in VR applications are very rare.
The current development environment for VR applications is centered on specific engines based on HMDs, and various compatibility issues must be considered. Recently, devices such as Leap Motion have emerged and been used with development tools in a structure appropriate for VR HMDs and their development environments. Thus, we can design a new interface by applying techniques for the accurate detection of hand gestures and motions to create an immersive VR. We can then use these techniques for interactions other than touching virtual objects, such as walking.
Studies on hand-based haptic systems are being expanded to the detection of various other body parts and feedback for physical reactions. For instance, a vibrotactile device has been proposed that provides the sensation of walking on the ground by delivering virtual ground stimulation to the feet of users [26]. Some studies have explored multimodality [27], and research has been conducted to enhance the immersion of the walking experience by synthesizing appropriate sounds when a user makes a walking motion while wearing haptic shoes [11]. The sensations of users walking on the ground while wearing the shoes were verified through experiments. However, tactile feedback during the walking process does not provide a spacious virtual environment in which users can freely move. Thus, additional research on walking in a spacious virtual environment with no spatial limitations is required. This study determined that using devices such as haptic shoes, which users can easily wear, to capture motions rather than tactile senses could enhance the presence of walking in VR.
A motion platform enables the realistic and diverse expression of user motions from the perspective of a spacious environment. Several studies have been conducted on enabling relatively free and diverse motions in a limited space. Cheng et al. [28] proposed Haptic Turk, which simulates the experience of flight by lifting the user with two or more actuators when the user assumes a flying position. They also conducted further studies to provide a highly immersive environment by tracking user locations and real props, and then reflecting the tracking information in VR as a user walks around an indoor space wearing an HMD [12]. They also simulated motions such as pushing against a wall, opening a door, and walking in virtual reality. The participants in the experiment evaluated whether or not they experienced higher levels of realism and immersion compared to an environment without props. Vasylevska et al. [29] proposed a flexible space that can provide infinite virtual walking in a limited real space. However, these systems have limitations for use in general applications due to their large space requirements and unclear constraints. Slater et al. [30] examined user walking behavior by using a neural network and analyzing the flow of coordinates in an HMD to enhance presence in VR. They found that this system provided greater presence compared to traditional walking controls for virtual navigation. Usoh et al. [21] also used an optical tracker and confirmed that users who were actually walking felt greater presence. Additionally, studies on illusory agency were conducted to induce the illusion of walking with a virtual body through the priming of behaviors in a sitting state. It was verified through experiments that this priming provided users with an experience similar to walking without requiring them to move. Recently, treadmill-type VR devices have been used to enable infinite movement in a small space.
However, they are difficult to use, except in specific experiential spaces, due to their high prices. For this reason, most VR applications still use a gamepad, keyboard, or VR pad for walking interaction. Considering these limitations, we have conducted comparative experiments between gamepads, hand motions, and march-in-place systems in order to find a walking interaction that can provide excellent presence and immersion, and which anyone can easily use and enjoy.

3. Walking Interaction in Virtual Reality Applications

3.1. Interaction Overview

We conduct experiments to find a walking interaction that can provide users with a highly immersive experience without causing VR sickness in various VR applications. The proposed walking interaction consists of three steps. We conduct surveys on immersion, presence, and VR sickness based on the proposed interaction, and analyze user feedback for walking inside immersive VR. The experimental environment was constructed by configuring a three-step walking interaction that consists of a simple walking control method using a gamepad, which is commonly used in 3D interactive content; a method for matching hand gestures with walking motions without additional devices other than an HMD, for improving presence in the virtual space; and an interaction method that can directly express a walking process similar to reality by attaching a portable simulator that detects march-in-place movements. Another objective of the proposed three-step interaction is to enhance the immersion of the walking process without using expensive devices. Therefore, we only conduct experiments on interaction via interfaces with a simple input method. Figure 1 illustrates the flow of the walking interaction experiment performed in this study.
For the sake of a fair experiment, the three-step walking interaction is implemented under consistent conditions in the VR scenes. The main condition is that rotation is not considered in the interaction. In order to change direction while walking, users must turn their body or head. Because the HMD tracks head motion, it determines whether rotation occurs, and the walking interaction itself never randomly or compulsorily changes the walking direction. In other words, once the user determines the walking direction, only forward motion through the walking interaction is considered in the experiments in this study. Also, for the sake of diversity in the experiments, existing commercial content from three VR concepts was purchased and used. In order to implement an experimental environment that combines the VR scenes with the three-step walking interaction, a development environment combining the Unity 3D engine (Unity Technologies, San Francisco, CA, USA) and the Oculus software development kit (SDK) (Oculus, City of Irvine, CA, USA) was constructed for this study (Supplementary Video S1).
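The forward-only movement rule above can be sketched in a few lines of Python. This is a minimal illustration, not the authors' Unity implementation; the walking speed and 75 FPS frame time are assumed values. Direction comes only from the tracked HMD yaw, and the interaction merely toggles forward motion on or off.

```python
import math

def forward_step(position, hmd_yaw_deg, walking, speed=1.4, dt=1.0 / 75):
    """Advance the character along the HMD's facing direction.

    Rotation is never injected by the interaction itself: the walking
    direction is read from head tracking, and the interaction only
    decides whether the character moves forward (walking=True) or
    stands still. speed (m/s) and dt (one 75 FPS frame) are
    illustrative assumptions.
    """
    if not walking:
        return position
    yaw = math.radians(hmd_yaw_deg)
    dx = math.sin(yaw) * speed * dt  # lateral component of the gaze direction
    dz = math.cos(yaw) * speed * dt  # forward component of the gaze direction
    return (position[0] + dx, position[1], position[2] + dz)
```

With yaw 0° the character advances along the z axis only; when the walking flag is off, the position is unchanged regardless of head orientation.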

3.2. Gamepad

The first walking interaction method is the gamepad. Gamepads and keyboards are frequently used to control motions in existing content, such as games. For this reason, VR applications often use gamepads and keyboards when it is difficult to provide separate input devices. Walking should be experienced in a standing position, because immersion cannot be achieved in a sitting position. In this case, it is difficult to use a keyboard as an input device. Therefore, a gamepad is used to implement the walking interaction in a standing position based on the conventional input processing method.
In this study, the walking interaction is implemented using an Xbox 360 controller (Microsoft, Redmond, WA, USA). In order to produce VR scenes with a walking interaction when using the Unity 3D engine, the input properties of the Xbox 360 controller and Unity 3D engine must be matched with the movements of the Oculus character controller. Figure 2 illustrates the process of matching the key values from the gamepad with the input properties of the Unity 3D engine and setting the properties of the Oculus virtual reality (OVR) controller, which controls character movements in the VR scenes.
The gamepad controls forward and backward walking based on scene changes and the head tracking of the HMD by using the up and down keys. This is the most passive type of interaction from a walking perspective because the user is unilaterally seeing the scene changes based on the input keys.
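The up/down-key mapping can be illustrated with a small sketch; the function name and dead-zone value are our assumptions, and the actual implementation maps the Xbox 360 controller axes to Unity 3D input properties rather than using Python.

```python
def gamepad_walk_state(vertical_axis, dead_zone=0.2):
    """Map the gamepad's vertical axis value (-1.0 to 1.0) to a walk
    command: forward above the dead zone, backward below it, and
    standing still in between. The dead zone filters stick drift and
    is an illustrative value, not one taken from the paper.
    """
    if vertical_axis > dead_zone:
        return "forward"
    if vertical_axis < -dead_zone:
        return "backward"
    return "stop"
```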

3.3. Hand Interface Based on Leap Motion

The next method of walking interaction uses a hand interface based on Leap Motion. The key to this interaction method is that it does not require any devices other than the HMD. The interface is designed in such a way that motion is controlled directly by hand gestures in order to improve the immersion of the walking process with a minimal equipment burden, and because the hand is the body part used most frequently in interaction. The second interaction was developed so that users actively express their intention via their hands, instead of passively watching scene changes after pressing a key on the gamepad. In this process, Leap Motion is used to accurately capture the movements of the hands and finger joints of the user, analyze the gestures, and reflect realistic motions in the VR application.
The Leap Motion sensor consists of two cameras that can detect hands using infrared light, and one infrared light source. It is a small device with a size of 12.7 mm × 80 mm, which can easily be attached to the front of the HMD. Thus, the user is not required to wear it separately. When a user wearing an HMD moves their hand toward the Leap Motion in front of the HMD, the motion is accurately detected through the infrared sensors. Integrating the Leap Motion SDK with the Unity 3D engine allows accurate detection of hand and finger motions, including hand capture, distinction of the left and right hands, and recognition of each finger. Furthermore, hand gestures can be defined based on these motions. Algorithm 1 defines the relationship between hand gestures and walking that is used in the second walking interaction. In this study, the actions of walking and stopping are matched to gestures.
Algorithm 1 Walking interaction of the hand interface based on Leap Motion.
 1: sequence ← capture virtual reality scene
 2: procedure WalkingDetermination(sequence)
 3:   fingers ← sequence.Hands[0].Fingers (save recognized fingers)
 4:   if sequence.Hands[0].IsRight == true then
 5:     fingerCount ← 0 (the number of extended fingers)
 6:     for i = 0 to 4 do
 7:       if fingers[i].IsExtended == true then
 8:         fingerCount ← fingerCount + 1
 9:       end if
10:     end for
11:     if fingerCount == 5 then
12:       determine walking state
13:     else if fingerCount == 0 then
14:       determine stop state
15:     end if
16:   end if
17: end procedure
Figure 3 presents the results of recognizing the hand of a user via Leap Motion and accurate rendering of two gestures defined in Algorithm 1 in the Unity 3D engine.
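The gesture rule of Algorithm 1 can be mirrored in a short Python sketch. The arguments `is_right` and `fingers_extended` are hypothetical stand-ins for the Leap Motion SDK's `Hands[0].IsRight` and per-finger `IsExtended` values; the actual system runs inside the Unity 3D engine.

```python
def walking_state(is_right, fingers_extended):
    """Decide the walk state from one captured hand frame.

    is_right mirrors sequence.Hands[0].IsRight; fingers_extended is
    the list of five IsExtended flags from Algorithm 1. An open palm
    (all five fingers extended) means walk, a closed fist (none
    extended) means stop, and any other gesture is ambiguous, so the
    previous state is kept (returned as None here).
    """
    if not is_right:
        return None
    finger_count = sum(1 for extended in fingers_extended if extended)
    if finger_count == 5:
        return "walk"
    if finger_count == 0:
        return "stop"
    return None  # ambiguous gesture: caller keeps the previous state
```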
The walking interaction in step two is designed to provide a hand interface that can directly express the intention for a walking state with no separate equipment. However, it has limitations in providing perfect immersion when walking, because walking is controlled by the hands instead of the legs.

3.4. Walking Simulator Using March-in-Place

The last walking interaction method in this study is march-in-place detection using a walking simulator. The purpose of this method is to directly express the intention of walking in a limited space. The interaction method should be easily usable in general VR applications, unlike studies detecting motion using an optical tracker in a limited space [21] or studies focused on presence, where walking is detected via brain waves [31] or illusions [14]. For this purpose, the walking interaction method using march-in-place is designed based on a portable walking simulator proposed by Lee et al. [16].
For the portable walking simulator, a wearable band-type system using Arduino (Interaction Design Institute Ivrea (IDII), Ivrea, Italy) is configured to detect the march-in-place motion of a user in a simple structure without using an expensive and complicated treadmill-type device. The system structure is outlined as follows. First, an Arduino Nano mainboard is used to design a portable simulator of the wearable band type. Then, a gyro sensor is connected to the simulator to detect the march-in-place motion. As shown in Figure 4, when a user moves forward via a march-in-place motion using the Arduino mainboard with a gyro sensor fixed to the ankle, the roll axis of the gyro sensor (consisting of yaw, pitch, and roll axes) changes the most. When a user turns their body and changes the facing direction of their eyes, the pitch axis changes the most. Therefore, the three axes of the Arduino board and gyro sensor are fixed so that they can detect march-in-place motion through a band worn on the ankle of the user. The gyro sensor used in this study is the MPU 6050 model. Finally, a portable walking simulator is worn on each leg to detect march-in-place motion, and the detected values are sent to the Unity 3D engine and reflected in the movements of the VR character. For this purpose, a Bluetooth module is connected to the Arduino. When the calculated gradient values of the roll and pitch axes from both legs are sent to the engine, walking speed and directional changes are calculated based on threshold values measured in previous experiments. These values may need to be modified because users may differ in their degree of leg movement during march-in-place motion. Figure 5 outlines the system components and the roles of the walking simulator proposed by Lee et al. [16].
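The threshold logic for turning roll-axis readings into a walking speed might look like the following sketch. The `threshold` and `gain` values are illustrative placeholders: the paper states only that thresholds were measured in prior experiments and may need per-user adjustment.

```python
def march_in_place_speed(roll_rates, threshold=30.0, gain=0.02):
    """Estimate a walking speed from the gyro roll-axis rates (deg/s)
    reported by the ankle bands on the two legs.

    The roll axis changes most during a march-in-place step, so a leg
    whose absolute roll rate exceeds the threshold counts as a step
    input; the speed is the gain times the mean of the active rates.
    threshold and gain are hypothetical calibration values.
    """
    active = [abs(rate) for rate in roll_rates if abs(rate) > threshold]
    if not active:
        return 0.0  # both legs below threshold: standing still
    return gain * sum(active) / len(active)
```

In the actual system these values arrive over the Bluetooth link and drive the Unity 3D character controller; here the function simply maps sensor magnitudes to a scalar speed.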
The walking simulator in the third step is the interaction method that reflects the user’s intention to walk in the most direct way. However, walking interaction through march-in-place motion is an approach that prioritizes immersion over convenience. When a user marches in place for a long time, they may become tired. It is important to find ways to enhance immersion for walking by using a march-in-place motion while simultaneously reducing the occurrence of VR sickness. Therefore, the levels of immersion and VR sickness for all the walking interaction methods are analyzed through various experiments.

4. Experimental Results and Analysis

4.1. Environment

The experimental environment for analyzing the immersion and VR sickness levels for our walking interactions was configured using Unity 3D 5.3.4f1 (Unity Technologies, San Francisco, CA, USA), the Oculus SDK (ovr_unity_utilities 1.3.2, Oculus, City of Irvine, CA, USA), and the Leap Motion SDK v.4.1.4 (Leap Motion, Inc., San Francisco, CA, USA). The portable walking simulator for step three was designed based on the model proposed by Lee et al. [16]. The specifications of the PC running the integrated development environment and experiments were an Intel Core i7-6700 processor (Intel Corporation, Santa Clara, CA, USA), 16 GB RAM, and a GeForce GTX 1080 GPU (NVIDIA, Santa Clara, CA, USA).
First, VR scenes were configured for performing the three-step walking interactions. The proposed walking interaction methods were applied to existing game scenes in order to improve the objectivity of the experimental results. In order to facilitate diversity in the experiments, a total of three VR scenes were used: a low-poly landscape from Stoolfeather Games [32], a cartoon town from Manufactura K4 [33], and a realistic nature environment from Zatylny [34]. Figure 6 presents the VR scenes used in this experiment (Supplementary Video S1). The field of view (FOV) of the HMD camera was set to 90° in all virtual scenes.
A few important conditions must be met in order to produce a VR application that can facilitate the experiments for the proposed walking interaction methods: the frame rate during the simulation must be at least 75 frames per second (FPS), and the total polygon count of the rendered scenes must not exceed two million (2 M). If these conditions are not met, VR sickness may occur as a result of scene transmission to the HMD even without any walking interaction. Table 1 outlines the analysis results of the technical performance of the three VR scenes used in this experiment. There were no problems in meeting the conditions. In the case of the nature scene, the polygon count was relatively high due to the grass on the ground, but it was found to be usable as an experimental scene through prior verification.
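The two screening conditions can be expressed as a simple check; the function name and parameters are ours, not the authors'.

```python
def scene_meets_vr_budget(fps, polygon_count,
                          min_fps=75, max_polygons=2_000_000):
    """Screen a VR scene against the two technical conditions used in
    the experiment: at least 75 FPS during simulation, and no more
    than two million rendered polygons."""
    return fps >= min_fps and polygon_count <= max_polygons
```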
Immersion and VR sickness levels were tested and analyzed by applying the three-step walking interactions to each VR scene. For this experiment, data was collected through a survey of random users. In order to obtain more reliable results, the Wilcoxon test was applied to the results of both the presence questionnaire and the simulator sickness questionnaire. There were 20 participants in total, between 20 and 35 years of age: 10 males (25.5 years on average) and 10 females (23.4 years on average), selected at random. Participants were university students and researchers in software-related majors, and were selected through an application process. The experimental environment was set up in a facility large enough to accommodate the march-in-place interaction (Figure 7). Based on the assumption that the ordering of the three-step interaction could influence immersion and presence, the experimental group was divided into two subgroups. The first experimental group experienced the VR scenes in the order of step one (gamepad, G), step two (hand interface, H), and step three (simulator, S). The second experimental group used the reverse order, starting with step three and ending with step one. The sequences were separated because the order of interactions may affect their perceived effectiveness. Each subgroup consisted of 5 males and 5 females, divided such that the average ages of the two subgroups were similar.
The three-step walking interaction in this study uses motions that are performed at approximately the same speed. However, the speed may feel different because each interaction uses a different input method. Furthermore, there is a temporal gap between the speed of the input and the speed of walking. Additionally, the participants had different heights, but the point of view is set at the same height in all the virtual scenes.

4.2. Immersion

The first experiment is a comparative analysis of the immersion of the proposed walking interactions. First, participants navigated each of the three scenes in their assigned order for the three interactions. The following question was then asked and answered a total of nine times (three scenes × three interaction methods): “Has the interaction method that you used provided sufficient immersion for walking in virtual reality?” Participants were asked to provide comparison values by focusing on the relationship between walking and immersion. They answered on a scale from 1 to 5 based on what they felt. Figure 8 displays the results. The three-step walking interaction was applied to each of the three virtual scenes with a focus on different concepts, and the response values were recorded. The results of the participants who experienced the walking interactions from step 1 to 3 and from step 3 to 1 were analyzed separately. The results confirmed that the change between virtual scenes with different ground patterns did not have a significant effect on the immersion levels for walking. When the mean values of the walking interaction in each step are compared, as shown in Figure 8a,b, the differences in values are smaller than 0.05, and the graph patterns do not change significantly. It is notable that the ranking for immersion was recorded in descending order for the portable walking simulator, hand interface, and gamepad regardless of the order of the virtual scenes. Furthermore, the portable walking simulator showed greater differences in values than those for the hand interface and gamepad. This result suggests that it is critical to provide an interaction method that allows users to use their legs in order to immerse them in the walking process in virtual scenes. Finally, the experimental group was divided into two subgroups under the assumption that the ordering of the proposed three-step interaction could influence immersion.
The left graphs display the results of the participants who experienced the interaction from step 1 to 3 and the right graphs display the results of the participants who experienced the interaction in reverse order. As can be seen from the graphs and mean values, the ordering of experiences had no effect on the overall results—although there were differences in the subjective experiences of the participants.
The Wilcoxon test was performed to determine if the three-step interaction provided different levels of satisfaction to users based on the survey data. The Wilcoxon test is a non-parametric statistical hypothesis test proposed by Wilcoxon [35]. This test is used to determine the significance or insignificance of the differences between two related samples. Thus, in order to verify the existence of differences between the three-step interactions, the probability of significance (p-value) was calculated by using the Wilcoxon test with the survey data as input values. Table 2 outlines the results.
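For illustration, the signed-rank statistic underlying the test can be computed as follows. This self-contained sketch returns only W = min(W⁺, W⁻); a p-value would then come from the null distribution of W or a normal approximation, which statistics packages provide.

```python
def wilcoxon_signed_rank(a, b):
    """Wilcoxon signed-rank statistic for paired samples a and b.

    Zero differences are dropped, tied absolute differences receive
    their average rank, and the statistic is the smaller of the
    positive and negative rank sums, W = min(W+, W-).
    """
    diffs = [x - y for x, y in zip(a, b) if x != y]
    order = sorted(range(len(diffs)), key=lambda i: abs(diffs[i]))
    ranks = [0.0] * len(diffs)
    i = 0
    while i < len(order):
        j = i
        # extend j over the group of tied absolute differences
        while j + 1 < len(order) and \
                abs(diffs[order[j + 1]]) == abs(diffs[order[i]]):
            j += 1
        avg_rank = (i + j) / 2 + 1  # average 1-based rank of the tie group
        for k in range(i, j + 1):
            ranks[order[k]] = avg_rank
        i = j + 1
    w_plus = sum(r for d, r in zip(diffs, ranks) if d > 0)
    w_minus = sum(r for d, r in zip(diffs, ranks) if d < 0)
    return min(w_plus, w_minus)
```

When every paired difference has the same sign, W is 0, the strongest possible evidence of a one-sided difference for that sample size.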
The difference between the gamepad and hand interface had a p-value of 5.2763544 × 10⁻², which is slightly higher than the significance level threshold (0.05). Thus, the null hypothesis is not rejected. In other words, even though the hand interface provides better immersion than the gamepad, the difference does not guarantee that the two interaction methods are significantly different. However, the p-values for the differences between the walking simulator and the other two interaction methods were 1.605174496 × 10⁻¹¹ and 1.594362280 × 10⁻¹¹, which are both smaller than the significance level threshold. Thus, the null hypothesis is rejected. Consequently, it is statistically provable that the interaction using the march-in-place motion provided better immersion. Therefore, methods or devices that directly use the legs should be used to provide walking interactions that are most similar to the real experience of walking.

4.3. Presence

In the second survey experiment, the presence of the walking interactions was analyzed. The immersion experiment analyzed whether or not the interaction provided walking motions to users with a high level of realism; the second experiment verifies whether the virtual scenes in the experiment and the overall environment (including the virtual characters) provide satisfactory presence to users. Participants were divided into the same groups as in the previous experiment. For this survey experiment, the presence questionnaire proposed by Witmer et al. [36,37] was used. This questionnaire consists of 24 questions that analyze the presence users feel in a virtual environment, and each item contributes to a detailed factor of presence. Items 3, 4, 5, 6, 7, 10, and 13 analyze realism; items 1, 2, 8, and 9 analyze the possibility to act; items 14, 17, and 18 analyze the quality of the interface; items 11, 12, and 19 analyze the possibility to examine the virtual environment; items 15 and 16 analyze self-evaluation of performance; items 20, 21, and 22 analyze sounds; and items 23 and 24 analyze haptic touch. Users enter a value between 1 and 7 for each question, with values closer to 7 indicating stronger agreement with the presented question. In this study, presence questionnaire results were collected for all three scenes in the three-step walking interaction. However, only 19 items were used, excluding items 20–24, because sounds and haptic touch were not considered in our tests. Table 3 outlines the results. The walking simulator using march-in-place received the highest satisfaction ratings (118.3, 119.3, 117.9), followed by the hand interface (114.6, 115.3, 114.2), with the gamepad coming in last (111.5, 112.2, 111.1).
Dividing the total scores by the number of items (19) shows that the walking simulator has relatively high per-item presence values (6.23, 6.28, 6.20). In particular, the walking simulator using march-in-place received high values (6.39, 6.49, 6.29) for the realism questions (3, 5, 7, and 10). Thus, we conclude that users experiencing the virtual environment with their own legs felt the highest levels of presence and realism. The three VR scenes used did not yield significant differences in presence.
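The questionnaire scoring described above (per-factor sums over the listed item groups, plus the per-item mean obtained by dividing the total by 19) can be sketched as follows. The factor names and item groupings come from the text; the sample answers are fabricated for illustration.

```python
# Sketch of the presence-questionnaire scoring used in the paper:
# 19 items rated 1-7 are summed per participant, and factor scores are
# taken over the item groups listed in the text (item numbers are 1-based).

FACTORS = {
    "realism":                [3, 4, 5, 6, 7, 10, 13],
    "possibility_to_act":     [1, 2, 8, 9],
    "interface_quality":      [14, 17, 18],
    "possibility_to_examine": [11, 12, 19],
    "self_evaluation":        [15, 16],
}

def score_pq(answers):
    """answers: dict mapping item number (1-19) to a rating in 1..7."""
    total = sum(answers.values())
    per_item = total / len(answers)            # e.g., 118.3 / 19 = 6.23
    factors = {name: sum(answers[i] for i in items)
               for name, items in FACTORS.items()}
    return total, per_item, factors

# Fabricated participant who answers 6 on every item.
sample = {i: 6 for i in range(1, 20)}
total, per_item, factors = score_pq(sample)
print(total, per_item, factors["realism"])     # 114 6.0 42
```

Table 3 reports such totals averaged over participants, with the per-item means shown in parentheses.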

4.4. VR Sickness

The last experiment focused on VR sickness. It is critical for walking interactions to prevent VR sickness in VR applications and to provide a sense of immersion to users. If VR devices are designed only for immersion, with no consideration for user VR sickness, only short sessions can be safely experienced. Thus, we conducted experiments to analyze the effects of the proposed walking interaction methods on VR sickness. We collected data using a simulator sickness questionnaire (SSQ) [38,39]. The SSQ consists of 16 questions and allows the analysis and measurement of simulator sickness from various perspectives. Each question was derived from the results of various experiments on the effects of simulators and interaction methods used in VR applications, and the symptoms of users. Participants chose one of four answers (none, slight, moderate, and severe) for each of the 16 questions. None, slight, moderate, and severe are converted to 0, 1, 2, and 3 points, respectively. The total score of each participant is then calculated after applying a weight value. The scores of all participants are statistically analyzed to determine the level of VR sickness. In particular, the SSQ can analyze sickness in detail by classifying the questions. Items 1, 6, 7, 8, 9, 15, and 16 indicate nausea; items 1, 2, 3, 4, 5, 9, and 11 indicate oculomotor sickness; and items 5, 8, 10, 11, 12, 13, and 14 indicate disorientation sickness. In general, sickness is analyzed based on the sum of the weighted (nausea: 9.54, oculomotor: 7.58, disorientation: 13.92) detail items (nausea, oculomotor, and disorientation). For objective analysis of symptoms, the raw data excluding weights can also be used [40]. In this study, both methods were applied to derive results. Table 4 displays the general analysis results for SSQ. The VR sickness results for each interaction based on the raw data were 6.45, 6.20, and 5.50 for gamepad control, hand control, and march-in-place control, respectively. 
When these values are divided by the total number of items (16) in the questionnaire, the results are 0.40, 0.39, and 0.34. Interpreting 0 as no VR sickness and 1 as slight VR sickness, these values suggest that the three-step walking interaction causes very little VR sickness on average. Furthermore, the walking interaction with march-in-place control caused the lowest levels of VR sickness (weighted total 28.80 and raw total 5.50, versus 31.98 and 6.45 for the gamepad (G) and 31.04 and 6.20 for the hand interface (H)). The reasons for these results were analyzed using the detailed items. In the case of the gamepad or hand interface, when there is an input, the camera moves toward the gaze direction without swaying. In the case of the walking simulator, however, when the user marches in place, the reaction is conveyed directly to the head and the user sees a reflection of this motion in the virtual scenes. This helps users feel like they are walking and helps them accurately perceive the direction of movement, which had the greatest influence on the disorientation items. Furthermore, it was found that a longer period of walking resulted in a greater chance of VR sickness. In the case of the gamepad, the raw score of 6.45 became 7.15 after 180 s and 8.05 after 300 s; thus, the increase grew from 0.7 to 0.9. In the case of the hand interface, the increase ranged from 0.55 to 0.85, which is lower than that of the gamepad. In the case of the march-in-place interaction, the increase was the lowest, ranging from 0.4 to 0.5. This result suggests that users walking with their own legs experienced lower VR sickness when participating in long sessions in a VR environment. Finally, the raw data for all participants were plotted on graphs, which are shown in Figure 9. There was a wide range in the levels of VR sickness indicated by users, from a minimum of 0 to a maximum of 22.
However, participants whose total score was 16 or higher (a mean item score of 1.0 (slight) or higher) accounted for only approximately 10% of the participants. In other words, 90% of the participants in this experiment felt only low levels of VR sickness while experiencing the proposed walking interactions.
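The SSQ scoring described above can be sketched as follows. The item groupings and subscale weights follow the text; the composite factor of 3.74 used for the weighted total is taken from the standard scoring of Kennedy et al. [38] and is an assumption here, as the text does not state it explicitly. The sample answers are fabricated.

```python
# Sketch of SSQ scoring: answers 0-3 (none..severe) for 16 items
# (1-based indices). Subscale sums are multiplied by the published
# weights; the weighted total uses the standard composite factor 3.74
# from Kennedy et al. (an assumption, not stated in the paper's text).

NAUSEA         = [1, 6, 7, 8, 9, 15, 16]
OCULOMOTOR     = [1, 2, 3, 4, 5, 9, 11]
DISORIENTATION = [5, 8, 10, 11, 12, 13, 14]

def score_ssq(answers):
    """answers: dict mapping item number (1-16) to a severity in 0..3."""
    n = sum(answers[i] for i in NAUSEA)
    o = sum(answers[i] for i in OCULOMOTOR)
    d = sum(answers[i] for i in DISORIENTATION)
    return {
        "nausea": n * 9.54,
        "oculomotor": o * 7.58,
        "disorientation": d * 13.92,
        "total": (n + o + d) * 3.74,
        "raw_total": sum(answers.values()),   # unweighted, as in Table 4
    }

# Fabricated participant reporting one "slight" symptom on item 1.
sample = {i: 0 for i in range(1, 17)}
sample[1] = 1
scores = score_ssq(sample)
print(scores)
```

Note that some items (e.g., 1, 5, 8, 9, and 11) contribute to two subscales, as in the original instrument, so the weighted subscores are not simple partitions of the raw total.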

5. Limitation and Discussion

The three steps of walking interaction in this study were constructed for the following reasons. The first premise was the exclusion of expensive devices that cannot be popularized. For this reason, a gamepad, the most widely used input device for interactive content, was selected as the first-step interaction. In the second step, a hand interface was suggested under the assumption that users are freest when wearing no device in a virtual space, because the hand is the body part most commonly used in interaction. Nevertheless, because the interaction analyzed in this study is walking, the march-in-place detection method suggested by Lee et al. [16] was set as the last step, using the legs, which are the body parts used for walking. The results showed that interaction using the legs was the key to both the immersion and the VR sickness of walking. Therefore, a comparative experiment with expensive treadmill-type VR devices, which allow large movements in a limited space, is still needed. In the future, the immersion and VR sickness of walking interaction should be analyzed in more detail through comparative experiments with various VR devices and techniques that control walking by moving the actual legs. In addition, a study by Usoh et al. [21] confirmed that real walking provides better presence. This study considered walking interactions that can be applied to VR applications, rather than only comparing presence and sickness; for this reason, it used an input device common in VR applications, the hands, which are often used for VR interactions, and a low-cost simulator that can directly express walking with a simple principle. However, the future applicability of these devices should be explored through comparative experiments with related studies, such as one that provides a walking experience with no motion [14].

6. Conclusions

In this study, experiments were conducted to analyze methods to enhance the immersion and minimize the VR sickness of users, focusing on walking among the various interactions required for VR applications. Under the assumption that expensive devices that cannot be popularized should not be used, simple three-step interactions that do not burden users were proposed. The three-step interactions were constructed as a progression from indirect to direct control of walking. The first step was interaction using a gamepad, which is often used in 3D interactive content such as games. The second step was an interface using the hands, which are frequently used in interactions. The last step was to detect march-in-place and match it to the movements of a VR character, which is the interaction method closest to walking. After setting up commercial VR scenes in line with the technical performance requirements of VR applications, the three-step interactions were applied to each scene. In addition, survey experiments were conducted with general users to analyze immersion, presence, and VR sickness. For immersion, answers to the question of whether users feel as if they are actually walking in an immersive environment were recorded and analyzed with the Wilcoxon test. As a result, no significant difference in immersion could be shown between the gamepad and hand interface; only the fact that march-in-place provided better immersion was verified. The presence experiment analyzed the three virtual scenes under the three-step interactions using the presence questionnaire. The results showed that march-in-place provided better presence, consistent with the immersion experiment; in addition, satisfaction with presence was generally high. Lastly, VR sickness was surveyed using the SSQ.
Notably, the method implementing walking interaction with the users' own legs reflected even the swaying of the head in the virtual environment and delivered the corresponding visual information to the user during walking, resulting in lower VR sickness. Consequently, providing a walking method close to real walking motion, using one's own legs, can prevent VR sickness as well as improve immersion and presence. Thus, if march-in-place detection devices based on mobile phones are steadily developed toward popularization, in the same way that low-priced devices such as Leap Motion made accurate hand-motion detection accessible, walking interactions could provide a highly satisfactory VR environment for users.

Supplementary Materials

The following are available online at www.mdpi.com/2073-8994/9/5/78/s1, Video S1: A Study on Immersion and VR sickness in Walking Interaction for Immersive Virtual Reality Applications.

Acknowledgments

This research was supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF), funded by the Ministry of Education (No. NRF-2014R1A1A2055834).

Author Contributions

Jiwon Lee, Mingyu Kim and Jinmo Kim conceived and designed the experiments; Jiwon Lee and Mingyu Kim performed the experiments; Jiwon Lee and Jinmo Kim analyzed the data; Jinmo Kim wrote the paper.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Slater, M. Presence and the sixth sense. Presence-Teleoper. Virtual Environ. 2002, 11, 435–439. [Google Scholar] [CrossRef]
  2. Sanchez-Vives, M.V.; Slater, M. From presence to consciousness through virtual reality. Nat. Rev. Neurosci. 2005, 6, 332–339. [Google Scholar] [CrossRef] [PubMed]
  3. Slater, M.; Sanchez-Vives, M.V. Enhancing Our Lives with Immersive Virtual Reality. Front. Robot. AI 2016, 3. [Google Scholar] [CrossRef]
  4. Tanaka, N.; Takagi, H. Virtual Reality Environment Design of Managing Both Presence and Virtual Reality Sickness. J. Physiol. Anthropol. Appl. Hum. Sci. 2004, 23, 313–317. [Google Scholar] [CrossRef]
  5. Sutherland, I.E. A Head-mounted Three Dimensional Display. In Proceedings of the Fall Joint Computer Conference, (Part I AFIPS’68), San Francisco, CA, USA, 9–11 December 1968; ACM: New York, NY, USA, 1968; pp. 757–764. [Google Scholar]
  6. Schissler, C.; Nicholls, A.; Mehra, R. Efficient HRTF-based Spatial Audio for Area and Volumetric Sources. IEEE Trans. Vis. Comput. Graph. 2016, 22, 1356–1366. [Google Scholar] [CrossRef] [PubMed]
  7. Serafin, S.; Turchet, L.; Nordahl, R.; Dimitrov, S.; Berrezag, A.; Hayward, V. Identification of virtual grounds using virtual reality haptic shoes and sound synthesis. In Proceedings of the Eurohaptics 2010 Special Symposium, 1st ed.; Nijholt, A., Dijk, E.O., Lemmens, P.M.C., Luitjens, S., Eds.; University of Twente: Enschede, The Netherlands, 2010; Volume 10-01, pp. 61–70. [Google Scholar]
  8. Seinfeld, S.; Bergstrom, I.; Pomes, A.; Arroyo-Palacios, J.; Vico, F.; Slater, M.; Sanchez-Vives, M.V. Influence of Music on Anxiety Induced by Fear of Heights in Virtual Reality. Front. Psychol. 2016, 6, 1969. [Google Scholar] [CrossRef] [PubMed]
  9. Zhao, W.; Chai, J.; Xu, Y.Q. Combining Marker-based Mocap and RGB-D Camera for Acquiring High-fidelity Hand Motion Data. In Proceedings of the ACM SIGGRAPH/Eurographics Symposium on Computer Animation; Eurographics Association (SCA’12), Aire-la-Ville, Switzerland, 11–13 July 2012; pp. 33–42. [Google Scholar]
  10. Tompson, J.; Stein, M.; Lecun, Y.; Perlin, K. Real-Time Continuous Pose Recovery of Human Hands Using Convolutional Networks. ACM Trans. Graph. 2014, 33, 169. [Google Scholar] [CrossRef]
  11. Nordahl, R.; Berrezag, A.; Dimitrov, S.; Turchet, L.; Hayward, V.; Serafin, S. Preliminary Experiment Combining Virtual Reality Haptic Shoes and Audio Synthesis. In Proceedings of the 2010 International Conference on Haptics—Generating and Perceiving Tangible Sensations: Part II (EuroHaptics’10), Amsterdam, The Netherlands, 8–10 July 2010; Springer: Berlin/Heidelberg, Germany, 2010; pp. 123–129. [Google Scholar]
  12. Cheng, L.P.; Roumen, T.; Rantzsch, H.; Köhler, S.; Schmidt, P.; Kovacs, R.; Jasper, J.; Kemper, J.; Baudisch, P. TurkDeck: Physical Virtual Reality Based on People. In Proceedings of the 28th Annual ACM Symposium on User Interface Software & Technology (UIST’15), Charlotte, NC, USA, 11–15 November 2015; ACM: New York, NY, USA, 2015; pp. 417–426. [Google Scholar]
  13. Slater, M.; Sanchez-Vives, M.V. Transcending the Self in Immersive Virtual Reality. IEEE Comput. 2014, 47, 24–30. [Google Scholar] [CrossRef]
  14. Kokkinara, E.; Kilteni, K.; Blom, K.J.; Slater, M. First Person Perspective of Seated Participants over a Walking Virtual Body Leads to Illusory Agency Over the Walking. Sci. Rep. 2016, 6, 28879. [Google Scholar] [CrossRef] [PubMed]
  15. Antley, A.; Slater, M. The Effect on Lower Spine Muscle Activation of Walking on a Narrow Beam in Virtual Reality. IEEE Trans. Vis. Comput. Graph. 2011, 17, 255–259. [Google Scholar] [CrossRef] [PubMed]
  16. Lee, J.; Jeong, K.; Kim, J. MAVE: Maze-based immersive virtual environment for new presence and experience. Comput. Anim. Virtual Worlds 2017. [Google Scholar] [CrossRef]
  17. Hoffman, H.G. Physically touching virtual objects using tactile augmentation enhances the realism of virtual environments. In Proceedings of the IEEE 1998 Virtual Reality Annual International Symposium, Atlanta, GA, USA, 14–18 March 1998; IEEE Computer Society: Washington, DC, USA, 1998; pp. 59–63. [Google Scholar]
  18. Carvalheiro, C.; Nóbrega, R.; da Silva, H.; Rodrigues, R. User Redirection and Direct Haptics in Virtual Environments. In Proceedings of the 2016 ACM on Multimedia Conference (MM’16), Amsterdam, The Netherlands, 15–19 October 2016; ACM: New York, NY, USA, 2016; pp. 1146–1155. [Google Scholar]
  19. Burdea, G.C. Haptics issues in virtual environments. In Proceedings of the Computer Graphics International 2000, Geneva, Switzerland, 19–24 June 2000; IEEE Computer Society: Washington, DC, USA, 2000; pp. 295–302. [Google Scholar]
  20. Schorr, S.B.; Okamura, A. Three-dimensional skin deformation as force substitution: Wearable device design and performance during haptic exploration of virtual environments. IEEE Trans. Haptics 2017. [Google Scholar] [CrossRef] [PubMed]
  21. Usoh, M.; Arthur, K.; Whitton, M.C.; Bastos, R.; Steed, A.; Slater, M.; Brooks, F.P., Jr. Walking > Walking-in-place > Flying, in Virtual Environments. In Proceedings of the 26th Annual Conference on Computer Graphics and Interactive Techniques (SIGGRAPH’99), Los Angeles, California, USA, 8–13 August 1999; ACM Press/Addison-Wesley Publishing Co.: New York, NY, USA, 1999; pp. 359–364. [Google Scholar]
  22. Hayward, V.; Astley, O.R.; Cruz-Hernandez, M.; Grant, D.; Robles-De-La-Torre, G. Haptic interfaces and devices. Sens. Rev. 2004, 24, 16–29. [Google Scholar] [CrossRef]
  23. Yano, H.; Miyamoto, Y.; Iwata, H. Haptic Interface for Perceiving Remote Object Using a Laser Range Finder. In Proceedings of the World Haptics 2009—Third Joint EuroHaptics Conference and Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems (WHC’09), Salt Lake City, UT, USA, 18–20 March 2009; IEEE Computer Society: Washington, DC, USA, 2009; pp. 196–201. [Google Scholar]
  24. Metcalf, C.D.; Notley, S.V.; Chappell, P.H.; Burridge, J.H.; Yule, V.T. Validation and Application of a Computational Model for Wrist and Hand Movements Using Surface Markers. IEEE Trans. Biomed. Eng. 2008, 55, 1199–1210. [Google Scholar] [CrossRef] [PubMed]
  25. Stollenwerk, K.; Vögele, A.; Sarlette, R.; Krüger, B.; Hinkenjann, A.; Klein, R. Evaluation of Markers for Optical Hand Motion Capture. In Proceedings of the 13 Workshop Virtuelle Realität und Augmented Reality der GI-Fachgruppe VR/AR, Bielefeld, Germany, 8–9 September 2016; Shaker Verlag: Aachen, Germany, 2016. [Google Scholar]
  26. Visell, Y.; Cooperstock, J.R.; Giordano, B.L.; Franinovic, K.; Law, A.; Mcadams, S.; Jathal, K.; Fontana, F. A Vibrotactile Device for Display of Virtual Ground Materials in Walking. In Proceedings of the 6th International Conference on Haptics: Perception, Devices and Scenarios (EuroHaptics’08), Madrid, Spain, 11–13 June 2008; Springer: Berlin/Heidelberg, Germany, 2008; pp. 420–426. [Google Scholar]
  27. Sreng, J.; Bergez, F.; Legarrec, J.; Lécuyer, A.; Andriot, C. Using an Event-based Approach to Improve the Multimodal Rendering of 6DOF Virtual Contact. In Proceedings of the 2007 ACM Symposium on Virtual Reality Software and Technology (VRST’07), Newport Beach, CA, USA, 5–7 November 2007; ACM: New York, NY, USA, 2007; pp. 165–173. [Google Scholar]
  28. Cheng, L.P.; Lühne, P.; Lopes, P.; Sterz, C.; Baudisch, P. Haptic Turk: A Motion Platform Based on People. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI’14), Toronto, Ontario, Canada, 26 April–1 May 2014; ACM: New York, NY, USA, 2014; pp. 3463–3472. [Google Scholar]
  29. Vasylevska, K.; Kaufmann, H.; Bolas, M.; Suma, E.A. Flexible spaces: Dynamic layout generation for infinite walking in virtual environments. In Proceedings of the 2013 IEEE Symposium on 3D User Interfaces (3DUI), Orlando, FL, USA, 16–17 March 2013; IEEE Computer Society: Washington, DC, USA, 2013; pp. 39–42. [Google Scholar]
  30. Slater, M.; Usoh, M.; Steed, A. Taking steps: The influence of a walking technique on presence in virtual reality. ACM Trans. Comput.-Hum. Interact. 1995, 2, 201–219. [Google Scholar] [CrossRef]
  31. Pfurtscheller, G.; Leeb, R.; Keinrath, C.; Friedman, D.; Neuper, C.; Guger, C.; Slater, M. Walking from thought. Brain Res. 2006, 1071, 145–152. [Google Scholar] [CrossRef] [PubMed]
  32. Stoolfeather-Games. Low-Poly Landscape. 2016. Available online: http://www.stoolfeather.com/low-poly-series-landscape.html (accessed on 17 January 2017).
  33. Manufactura-K4. Cartoon Town. 2014. Available online: https://www.assetstore.unity3d.com/en/#!/content/17254 (accessed on 17 January 2017).
  34. Zatylny, P. Realistic Nature Environment. 2016. Available online: https://www.assetstore.unity3d.com/kr/#!/content/58429 (accessed on 17 January 2017).
  35. Wilcoxon, F. Individual Comparisons by Ranking Methods. Biom. Bull. 1945, 1, 80–83. [Google Scholar] [CrossRef]
  36. Witmer, B.G.; Jerome, C.J.; Singer, M.J. The Factor Structure of the Presence Questionnaire. Presence Teleoper. Virtual Environ. 2005, 14, 298–312. [Google Scholar] [CrossRef]
  37. Witmer, B.G.; Singer, M.J. Presence Questionnaire. 2004. Available online: http://w3.uqo.ca/cyberpsy/docs/qaires/pres/PQ_va.pdf (accessed on 20 March 2017).
  38. Kennedy, R.S.; Lane, N.E.; Berbaum, K.S.; Lilienthal, M.G. Simulator Sickness Questionnaire: An Enhanced Method for Quantifying Simulator Sickness. Int. J. Aviat. Psychol. 1993, 3, 203–220. [Google Scholar] [CrossRef]
  39. Kennedy, R.S.; Lane, N.E.; Berbaum, K.S.; Lilienthal, M.G. Simulator Sickness Questionnaire. 2004. Available online: http://w3.uqo.ca/cyberpsy/docs/qaires/ssq/SSQ_va.pdf (accessed on 18 February 2017).
  40. Bouchard, S.; St-Jacques, J.; Renaud, P.; Wiederhold, B.K. Side effects of immersions in virtual reality for people suffering from anxiety disorders. J. Cyberther. Rehabil. 2009, 2, 127–137. [Google Scholar]
Figure 1. System overview of the proposed three-step walking interaction.
Figure 2. Setting the input properties of the gamepad and the properties of the character controller of the Oculus head mounted display (HMD) (Oculus, City of Irvine, CA, USA) in the integrated development environment of the Unity 3D engine (Unity Technologies, San Francisco, CA, USA).
Figure 3. Hand gesture results recognized based on the proposed hand interface.
Figure 4. March-in-place detection process through gyro sensor.
Figure 5. System components and roles of the portable walking simulator.
Figure 6. Virtual reality (VR) scenes for experiments of the proposed walking interaction methods (result of scene rendering according to the location and viewpoint of character): (a) low-poly landscape; (b) cartoon town; (c) realistic nature environment.
Figure 7. Walking interaction experience environment for questionnaire experiment: (a) PC, VR device and walking simulator; (b) Example of gamepad experience; (c) Example of hand interface experience; (d) Example of portable walking simulator experience.
Figure 8. Results of questionnaire experiments on the immersion of the proposed walking interaction methods (survey question: Has the interaction that you used provided sufficient immersion for walking in virtual reality? 1: Not at all; 5: Very much so; left: Experimental group experiencing in the sequence of gamepad → hand interface → walking simulator; right: Experimental group experiencing in the sequence of walking simulator → hand interface → gamepad): (a-1,b-1) Experimental results in VR scene (low-poly landscape); (a-2,b-2) Experimental results in VR scene (cartoon town); (a-3,b-3) Experimental results in VR scene (realistic nature environment).
Figure 9. Statistical analysis results for the simulator sickness questionnaire (SSQ) raw data.
Table 1. Analysis results of the technical performance of the experimental virtual reality (VR) scenes.

| Scene | FPS Minimum | FPS Maximum | FPS Mean | Polygons Minimum (millions, M) | Polygons Maximum (M) | Polygons Mean (M) |
|---|---|---|---|---|---|---|
| Low-poly Landscape | 83.1 | 108.3 | 88.7 | 0.10693 | 1.1015 | 0.60543 |
| Cartoon Town | 77.3 | 124.4 | 98.7 | 0.24053 | 0.6252 | 0.43125 |
| Nature Environment | 117.1 | 144.0 | 122.3 | 0.84647 | 2.1570 | 1.15134 |
Table 2. Wilcoxon test results of the proposed walking interaction methods.

| Samples | Significance Probability (p-Value) |
|---|---|
| Gamepad : Hand interface | 5.2763544 × 10⁻² |
| Gamepad : Walking simulator | 1.605174496 × 10⁻¹¹ |
| Hand interface : Walking simulator | 1.594362280 × 10⁻¹¹ |
Table 3. Presence questionnaire analysis results for the proposed walking interaction methods (gamepad, G; hand interface, H; and walking simulator, S). Mean values are totals, with per-item means (raw data) in parentheses; LPL: Low-Poly Landscape; CT: Cartoon Town; NE: Nature Environment.

| Factor | Method | Mean LPL | Mean CT | Mean NE | SD LPL | SD CT | SD NE |
|---|---|---|---|---|---|---|---|
| Total | G | 111.5 (5.87) | 112.2 (5.91) | 111.1 (5.85) | 3.44 | 3.79 | 3.70 |
| Total | H | 114.6 (6.03) | 115.3 (6.07) | 114.2 (6.01) | 3.35 | 3.47 | 3.52 |
| Total | S | 118.3 (6.23) | 119.3 (6.28) | 117.9 (6.20) | 5.33 | 5.16 | 5.09 |
| Realism | G | 39.0 (5.57) | 39.7 (5.67) | 38.6 (5.51) | 1.84 | 2.00 | 1.85 |
| Realism | H | 41.5 (5.93) | 42.2 (6.03) | 41.1 (5.87) | 1.50 | 1.33 | 1.51 |
| Realism | S | 44.7 (6.39) | 45.4 (6.49) | 44.0 (6.29) | 2.24 | 2.15 | 1.61 |
| Possibility to act | G | 25.5 (6.38) | 25.5 (6.38) | 25.5 (6.38) | 1.02 | 1.02 | 1.02 |
| Possibility to act | H | 25.2 (6.30) | 25.2 (6.30) | 25.2 (6.30) | 0.98 | 0.98 | 0.98 |
| Possibility to act | S | 25.1 (6.28) | 25.1 (6.28) | 25.1 (6.28) | 1.51 | 1.51 | 1.51 |
| Quality of interface | G | 17.4 (5.80) | 17.4 (5.80) | 17.4 (5.80) | 1.36 | 1.36 | 1.36 |
| Quality of interface | H | 17.8 (5.93) | 17.8 (5.93) | 17.8 (5.93) | 1.25 | 1.25 | 1.25 |
| Quality of interface | S | 18.0 (6.00) | 18.0 (6.00) | 18.0 (6.00) | 1.48 | 1.48 | 1.48 |
| Possibility to examine | G | 18.2 (6.07) | 18.2 (6.07) | 18.2 (6.07) | 1.24 | 1.24 | 1.24 |
| Possibility to examine | H | 18.2 (6.07) | 18.2 (6.07) | 18.2 (6.07) | 1.25 | 1.25 | 1.25 |
| Possibility to examine | S | 18.3 (6.10) | 18.6 (6.20) | 18.6 (6.20) | 1.27 | 1.11 | 1.17 |
| Self-evaluation of performance | G | 11.4 (5.70) | 11.4 (5.70) | 11.4 (5.70) | 1.02 | 1.02 | 1.02 |
| Self-evaluation of performance | H | 11.9 (5.95) | 11.9 (5.95) | 11.9 (5.95) | 0.94 | 0.94 | 0.94 |
| Self-evaluation of performance | S | 12.2 (6.10) | 12.2 (6.10) | 12.2 (6.10) | 1.17 | 1.17 | 1.17 |
Table 4. Simulator sickness questionnaire (SSQ) analysis results for the proposed walking interaction methods (gamepad, G; hand interface, H; and walking simulator, S).

Original SSQ (weighted):

| Measure | Method | Mean | SD | Minimum | Maximum |
|---|---|---|---|---|---|
| Total | G | 31.98 | 28.67 | 0 | 89.76 |
| Total | H | 31.04 | 25.23 | 0 | 82.28 |
| Total | S | 28.80 | 22.03 | 0 | 74.80 |
| Nausea | G | 13.36 | 14.90 | 0 | 38.16 |
| Nausea | H | 12.40 | 13.86 | 0 | 38.16 |
| Nausea | S | 12.89 | 13.92 | 0 | 57.24 |
| Oculomotor | G | 26.90 | 24.15 | 0 | 68.22 |
| Oculomotor | H | 25.39 | 19.95 | 0 | 60.64 |
| Oculomotor | S | 24.64 | 17.76 | 0 | 60.64 |
| Disorientation | G | 50.11 | 51.22 | 0 | 167.04 |
| Disorientation | H | 50.81 | 47.87 | 0 | 167.04 |
| Disorientation | S | 43.15 | 37.33 | 0 | 125.28 |

Alternative SSQ (raw data):

| Measure | Method | Mean | SD | Minimum | Maximum |
|---|---|---|---|---|---|
| Total | G | 6.45 | 6.10 | 0 | 22 |
| Total | H | 6.20 | 5.32 | 0 | 19 |
| Total | S | 5.50 | 4.57 | 0 | 17 |
| Nausea | G | 1.40 | 1.56 | 0 | 4 |
| Nausea | H | 1.30 | 1.45 | 0 | 4 |
| Nausea | S | 1.30 | 1.49 | 0 | 6 |
| Oculomotor | G | 3.55 | 3.19 | 0 | 10 |
| Oculomotor | H | 3.35 | 2.63 | 0 | 8 |
| Oculomotor | S | 3.25 | 2.34 | 0 | 8 |
| Disorientation | G | 3.60 | 3.68 | 0 | 12 |
| Disorientation | H | 3.65 | 3.44 | 0 | 13 |
| Disorientation | S | 3.10 | 2.68 | 0 | 9 |
| After 180 s | G | 7.15 | 7.44 | 0 | 26 |
| After 180 s | H | 6.75 | 6.58 | 0 | 22 |
| After 180 s | S | 5.90 | 5.11 | 0 | 21 |
| After 300 s | G | 8.05 | 7.59 | 1 | 32 |
| After 300 s | H | 7.60 | 6.98 | 1 | 28 |
| After 300 s | S | 6.40 | 5.44 | 1 | 26 |