Virtual reality (VR) technology enables users to perceive the virtual world as if it were real, and to interact with it through various sensations delivered in a computer-implemented virtual world [1]. Studies on VR technology have been conducted since Ivan Sutherland [3] developed a head-mounted display (HMD) in 1968, and VR technology has since been used widely in various industries. For example, Lele [4] presented an assessment of VR technology for military applications. Parsons and Rizzo [5] conducted a feasibility assessment of VR technology for medical treatment, and Seymour et al. [6] analyzed the use of VR technology for medical training in the operating room. Portman and Natapov [7] introduced the application of VR technology to architecture, landscape architecture, and environmental planning. Gotsis et al. [8] developed a VR-based games platform for body exercises and rehabilitation. McMahan et al. [9] evaluated the effects of fidelity on the user in a complex, performance-intensive context using a VR game. Grossard et al. [10] proposed the use of games utilizing VR technology for treating individuals with autism spectrum disorders.
Moreover, VR technology has been intensively studied for human training purposes. For instance, Chenechal et al. [11] developed a remote guiding system that assists a user by way of two virtual arms controlled by a remote expert in the VR environment. Torres et al. [12] developed a virtual workspace in which workers are trained in welding operations. Anton et al. [13] proposed telerehabilitation systems that can support physical therapy in virtual environments, and Quail et al. [14] introduced a digital game for medical education that improves the engagement, confidence, and knowledge of undergraduate students.
Several studies have focused on VR technology in the mining industry. Van Wyk and de Villiers [15] developed simulation software for indirectly experiencing workplaces in a VR environment to prevent safety accidents at South African mining sites. Their work enabled workers to effectively learn the hazards at mining sites and how to cope with accidents, should they occur. In a similar study, Kizil et al. [16] developed software for training the control of equipment in underground mines and ventilation facilities in a VR environment, and Orr et al. [17] developed software for training users on how to respond to underground mine fires. A number of start-up companies have recently launched products that incorporate VR technology into the mining industry. For example, Llamazoo [18] developed software that enables intuitive development planning by visualizing mining site data in a VR environment, and MOON PATROL VR [19] developed a system that offers a 360-degree bird's-eye view of mining sites in a VR environment. In addition, VR software for preventing safety accidents at mining sites [20] and for training users in equipment operation [21] has also been launched onto the market.
To apply VR technology to the mining industry, it is essential to develop devices that can effectively control software; most VR content in the mining industry requires precise control for equipment operation or accident response. In other industries, studies have been conducted on the development of user interface devices that can enhance user immersion and effectively control software in a VR environment [22]. For example, Kaiser et al. [24] developed 3D multimodal interaction in immersive VR environments using gesture, language, and gaze. Kok and van Liere [25] developed a 3D interaction system that enables users to control 3D widgets for interacting with visualized data using a head tracker, stereo glasses, and a camera in a virtual environment. Such user interfaces enable the user to control objects more intuitively and with greater immersion in the VR environment.
In the mining industry, several studies have been conducted on the development of 3D user interfaces (3DUIs) for controlling VR-based software [28]. Kim and Choi [28] developed a 3D user interface based on gesture and hand tracking, using a Kinect sensor and a bend-sensing data glove, to control mining software in a VR environment; Bednarz et al. [29] proposed a new interaction approach that can control mining equipment using a touch screen, a data glove, and a spherical dome in an immersive VR environment. However, previous studies have mainly focused on the development of 3DUI devices, and have not quantitatively compared the performance of the various user interface devices for controlling mining industry software in a VR environment.
In this study, we compared the performance of four user interface devices (a 2D mouse, 2D & 3D mice, a VR controller, and a bend-sensing data glove with a Kinect sensor) when controlling KmodStudio [30], a mining industry software product, in a VR environment. We analyzed the total working time, the number of device clicks, and the click accuracy for ten experimenters performing 3D orebody modeling with each of the four user interface devices in a VR environment. In addition, we conducted a user survey after the experiment to evaluate the ease of learning, ease of use, immersion, and fatigue for each device.
2. Materials and Methods
We compared the performance of four user interface devices for controlling mining industry software in a virtual reality (VR) environment. The total working time, the number of device clicks, and the click accuracy were analyzed for ten experimenters as they performed 3D orebody modeling using each device in the VR environment.
Furthermore, we conducted a survey after the experiment to evaluate the ease of learning, ease of use, immersion, and fatigue for each device. Table 1 describes the experimental conditions of this study.
2.1. User Interface Device
We used a typical 2D mouse in the experiment (Figure 1a). The experimenters placed the 2D mouse on a desk and, sitting on a chair, controlled KmodStudio with the mouse according to the predefined procedure.
Similar to the 2D mouse experiment, for the 2D & 3D mice, the experimenters placed the two devices on the desk and used them while sitting on the chair. Each experimenter controlled KmodStudio using the 2D mouse with the right hand and the 3D mouse (Figure 1b) with the left hand. Using the controller at the center of the 3D mouse, the experimenters performed rotation, zoom-in, and zoom-out operations in the X, Y, and Z directions.
Figure 1c shows the VR controller used in this study. The experimenters controlled the position of the mouse cursor while standing, by holding the device in both hands and moving their wrists in the VR environment. Click events were implemented using the buttons.
The Kinect sensor & bend-sensing data glove (Figure 1d) were developed by Kim et al. [28] as a 3DUI for use in a VR environment. The experimenters controlled the position of the cursor via the Kinect sensor, a 3D depth-measurement camera that tracks the position of the hand and the movement of the arm. KmodStudio runs the commands listed in Table 2 via the bend-sensing data glove, which measures the degree of bending of each finger. The experimenters used this device while standing.
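As a rough illustration of how bend-sensor readings can drive software commands, the sketch below thresholds normalized bend values into a discrete hand pose and looks up a command. The threshold, finger poses, and command names are illustrative assumptions and do not reproduce the actual mapping of Table 2.

```python
# Hypothetical sketch: mapping glove bend readings to software commands.
# The threshold, pose sets, and command names are illustrative assumptions,
# NOT the actual Table 2 command mapping.

BEND_THRESHOLD = 0.6  # normalized bend above which a finger counts as bent

# Which set of bent fingers triggers which (hypothetical) command.
GESTURE_COMMANDS = {
    frozenset(["index"]): "click",
    frozenset(["index", "middle"]): "double_click",
    frozenset(["thumb", "index", "middle", "ring", "little"]): "grab",
}

def classify_gesture(bend_values):
    """Map {finger name: normalized bend in [0, 1]} to a command, or None."""
    bent = frozenset(f for f, v in bend_values.items() if v >= BEND_THRESHOLD)
    return GESTURE_COMMANDS.get(bent)  # None when the pose matches no command
```

For example, a reading with only the index finger strongly bent would be classified as `"click"`, while an unrecognized pose yields `None`.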
2.2. Experimenters
The experimenters consisted of six males and four females between 22 and 27 years old (average 24.8). All of the experimenters had normal vision, or vision corrected to within the normal range. They had no physical issues inhibiting the use of the four interface devices, and primarily used their right hand when operating them. The experimenters were already familiar with using PCs, 2D mice, and KmodStudio; however, they had never used the other three interface devices, and had no previous experience of controlling software in a VR environment with an HMD. Therefore, before conducting the comparison experiment, we provided the ten experimenters with 30 min of training in controlling KmodStudio with the HMD using each interface device. To prevent the learning effects that may occur when experimenters use the interface devices in the same order, the experimenters performed the work using the interface devices in different orders. In addition, to prevent the accumulation of fatigue from using the interfaces repeatedly, rest time was provided after each interface was used.
2.3. KmodStudio
KmodStudio is software for 3D orebody modeling and mine design, and is widely used in the mining industry [30]. In this study, we set up a scenario for performing 3D orebody modeling with KmodStudio, using the following procedure:
(1) After visualizing data representing the orebody grade in 3D space, the user checks the geometric isotropy of the data. To visualize the data, the menu bar at the top is clicked sequentially. To check geometric isotropy, the modeling results are confirmed in real time by clicking various numerical values in the variogram modeling window, as shown in Figure 2.
(2) Perform spatial interpolation of the 3D ore grade values using ordinary Kriging (Gaussian process regression). The GetPointSetOKG function is used to perform ordinary Kriging; to use this function, the variogram modeling results produced in the previous step are called up, and the numerical values are entered.
(3) After adjusting the range of the X, Y, and Z axes to visualize the spatially interpolated data, check the results of the 3D orebody grade estimation by zooming in and out of the screen. To define the range of the X, Y, and Z axes, we adjusted the scale bar for the visualization area, as shown in Figure 2.
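Step (2) can be illustrated with a minimal ordinary Kriging sketch in Python. This is a generic textbook formulation with an assumed spherical variogram model and illustrative parameters; it does not reproduce KmodStudio's GetPointSetOKG implementation.

```python
import numpy as np

def spherical_variogram(h, nugget=0.0, sill=1.0, rng=100.0):
    """Spherical semivariogram gamma(h); parameter values are illustrative."""
    h = np.asarray(h, dtype=float)
    g = np.where(
        h < rng,
        nugget + (sill - nugget) * (1.5 * h / rng - 0.5 * (h / rng) ** 3),
        sill,
    )
    return np.where(h == 0.0, 0.0, g)

def ordinary_kriging(coords, values, target):
    """Ordinary Kriging estimate at `target` from sampled coords/values."""
    n = len(values)
    # Pairwise sample-to-sample semivariances.
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    # Kriging system with a Lagrange multiplier row/column enforcing
    # that the weights sum to one (the "ordinary" constraint).
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = spherical_variogram(d)
    A[n, n] = 0.0
    b = np.ones(n + 1)
    b[:n] = spherical_variogram(np.linalg.norm(coords - target, axis=-1))
    w = np.linalg.solve(A, b)       # n weights plus the Lagrange multiplier
    return float(w[:n] @ values)    # weighted average of the sample grades
```

Because the weights sum to one, Kriging honors the sample values exactly at sampled locations, which is why the variogram fitted in step (1) directly shapes the interpolated grade model.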
According to the above procedure, we classified the unit operations, such as mouse cursor movements and clicks, required to control KmodStudio, and made a list of 69 operations. Following this operation list, the experimenters performed the comparison experiment by controlling KmodStudio using the four user interface devices. However, when using the Kinect sensor and the bend-sensing data glove, users can perform multiple unit operations at one time according to the finger bending motions, as shown in Table 2; thus, for this device, the list actually comprised 41 operations. Figure 2 shows the major unit operations required for 3D orebody modeling using KmodStudio.
2.4. PC and HMD Device
We conducted the experiment using a desktop PC with an Intel Core i9-7940X CPU (Intel, Santa Clara, CA, USA), 64 GB of Samsung RAM (Samsung Electronics, Suwon, Korea), and an ASUS GTX 1080Ti 11 GB graphics card (ASUS, Beitou, Taiwan). We used Windows 10 Pro for Workstations as the PC operating system, and the Oculus Rift (Oculus, Irvine, CA, USA) as the HMD for implementing the VR environment. We used the Virtual Desktop (Oculus, Irvine, CA, USA) software to project the PC screen onto the HMD. Additionally, we connected a large monitor to the PC so that we could view the screen that the experimenters viewed through the HMD. Through this, we were able to analyze the working time spent on each unit operation, the number of device clicks, and the click accuracy by recording the process as each experimenter controlled KmodStudio using each of the user interface devices (Figure 3).
2.5. User Survey
We conducted a user survey to evaluate the ease of learning, ease of use, immersion, and fatigue for the four interface devices. Ease of learning and ease of use were each rated from 1 (very difficult) to 5 (very easy). The level of immersion that users felt in the VR environment when using each device was rated from 1 (none) to 5 (very high), and the level of fatigue felt when using each device was rated from 1 (none) to 5 (very high).
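The survey ratings can be summarized per device by averaging the 1-5 responses for each criterion; a minimal sketch follows, in which the response values are invented for illustration and are not the study's actual data.

```python
from statistics import mean

# Illustrative 1-5 Likert responses per device and criterion; these
# numbers are made up for the example, not the actual survey data.
responses = {
    "2D mouse": {"ease_of_learning": [5, 4, 5], "fatigue": [1, 2, 1]},
    "VR controller": {"ease_of_learning": [3, 2, 3], "fatigue": [4, 4, 5]},
}

def mean_scores(responses):
    """Average each criterion's ratings for every device."""
    return {
        device: {criterion: mean(ratings) for criterion, ratings in crits.items()}
        for device, crits in responses.items()
    }
```

Calling `mean_scores(responses)` yields one average score per device and criterion, which is the form in which the survey results are reported below.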
3. Experimental Results
Figure 4 shows the cumulative working times of three of the experimenters wearing the HMD device while performing the 69 unit operations using the four user interface devices in the VR environment. The working times for each user interface device differed slightly between experimenters. Experimenter No. 1 finished the operations fastest when using the Kinect sensor and the bend-sensing data glove, and spent the longest time when using the 2D & 3D mice. On the other hand, Experimenters No. 2 and No. 3 spent the shortest time when using the 2D mouse. Apart from the VR controller, no significant difference was observed in the working time spent by Experimenter No. 2 on the other three user interface devices. However, for Experimenters No. 1 and No. 3, there was a difference in the working time spent using all four user interface devices.
The difference in the number of device clicks with the fingers while performing the operations was relatively small for each experimenter (Figure 5). All experimenters (Nos. 1-3) recorded the fewest clicks when using the Kinect sensor and the bend-sensing data glove, followed by the 2D mouse, the 2D & 3D mice, and the VR controller. The number of clicks was generally higher when using the VR controller, presumably because the VR controller is difficult to control precisely.
Figure 6 shows the mean and box plots of the experimental results for the 10 experimenters. The box plots include the minimum, median, and maximum values, along with the interquartile range (IQR) bounded by the first quartile (Q1) and third quartile (Q3). The mean value for the 10 experimenters is denoted by the X symbol. The 2D mouse recorded the shortest total working time, with an average of 99.3 s, followed in order by the Kinect sensor and bend-sensing data glove and the 2D & 3D mice. The VR controller recorded the longest time, taking 30 s longer than the 2D mouse. The Kinect sensor and bend-sensing data glove took longer than the 2D mouse; however, the difference was only 3.6 s.
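The box-plot summary statistics described above (minimum, Q1, median, Q3, maximum, and IQR) can be computed for one device's working times as follows; the input values in the test of usage are illustrative, not the measured data.

```python
import numpy as np

def box_stats(times):
    """Box-plot statistics (min, Q1, median, Q3, max, IQR) for one device."""
    times = np.asarray(times, dtype=float)
    # Quartiles via linear interpolation, numpy's default percentile method.
    q1, median, q3 = np.percentile(times, [25, 50, 75])
    return {
        "min": float(times.min()), "q1": float(q1), "median": float(median),
        "q3": float(q3), "max": float(times.max()), "iqr": float(q3 - q1),
    }
```

For example, `box_stats([1, 2, 3, 4, 5])` gives Q1 = 2.0, median = 3.0, Q3 = 4.0, and IQR = 2.0.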
The number of device clicks while performing the operations was fewest (31.4 clicks) when using the Kinect sensor and the bend-sensing data glove. This is because the shortcut command functions according to the finger motions, as shown in Table 2, were used when controlling KmodStudio. This was followed in order by the 2D mouse, the 2D & 3D mice, and the VR controller, which recorded the greatest number of clicks. The difference in the number of clicks between the user interface devices arises because incorrect clicks made while performing the operations are also included in the total number of clicks: a greater number of clicks to perform the same operations translates into more incorrect clicks.
The click accuracy refers to the ratio of the number of correct clicks to the total number of clicks. Using the 2D mouse resulted in a 6% higher click accuracy than using the 2D & 3D mice together, and a 5% higher click accuracy than using the Kinect sensor and the bend-sensing data glove; that is, the click accuracies of these three devices were similar. On the other hand, there was a significant difference (of 14%) between the 2D mouse and the VR controller. When using the 2D & 3D mice together, the experimenters tended to have trouble controlling the devices with both hands, resulting in incorrect clicks.
When using the Kinect sensor and the bend-sensing data glove, the experimenters often mis-clicked accidentally while bending their fingers or moving their wrists. In particular, with the VR controller, the experimenters often failed to click because the cursor moved considerably in response to even slight movements of their wrists.
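The click-accuracy metric defined above amounts to a one-line ratio; a minimal helper is shown below, with illustrative counts (not the study's measured data) in the usage example.

```python
def click_accuracy(correct_clicks, total_clicks):
    """Click accuracy as defined in the text: correct clicks / total clicks."""
    if total_clicks <= 0:
        raise ValueError("total_clicks must be positive")
    return correct_clicks / total_clicks
```

For instance, 3 correct clicks out of 4 total clicks gives an accuracy of 0.75; every incorrect click raises the total and therefore lowers the accuracy, which is why devices needing more clicks for the same operations score lower.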
In terms of the total working time, the distribution across experimenters varied when using the 2D & 3D mice together, whereas, when using the Kinect sensor and the bend-sensing data glove, most of the experimenters were close to the average value. The number of device clicks during the total operation showed a large variation between the experimenters when using the VR controller, while the other three interfaces showed similar results overall. For click accuracy, the experimenters showed varied distributions when using the VR controller; it appears that the experimenters differed significantly in how well they learned the VR controller's operation method during the training time (30 min provided for each interface device before starting the experiment). Finally, the experimenters reached a similar learning level with the Kinect sensor and the bend-sensing data glove.
Figure 7 shows the results of the survey on ease of learning, ease of use, immersion, and fatigue. Regarding ease of learning, the 2D mouse received the highest score (4.8), followed in order by the Kinect sensor and bend-sensing data glove, the 2D & 3D mice, and finally the VR controller. While the 2D mouse did not require any additional learning, the experimenters appeared to have difficulty rotating objects in the software with the 3D mouse joystick when using the 2D & 3D mice together. Regarding the Kinect sensor and the bend-sensing data glove, the experimenters had difficulty learning the control functions according to the finger bending motions listed in Table 2.
Regarding ease of use, the 2D mouse also received the highest score. For the other three user interface devices, ease of use showed patterns similar to those of ease of learning; however, the ease-of-use score for the VR controller was significantly different from those of the other interface devices. This is considered to be because the experimenters had difficulty controlling the software with the VR controller.
Regarding the immersion that the experimenters felt while performing operations in the VR environment, the Kinect sensor and the bend-sensing data glove received the highest score, followed by the VR controller. The 2D mouse and the 2D & 3D mice combination received relatively low scores for immersion because the experimenters used them at the desk while sitting on a chair, which limited their movements and reduced their immersion in the VR environment.
The fatigue that the experimenters felt for each interface device in the VR environment showed patterns similar to those of immersion. The experimenters reported high fatigue when using the Kinect sensor and bend-sensing data glove and the VR controller, both of which they used while standing, and relatively low fatigue when using the 2D mouse and the 2D & 3D mice while sitting.
4. Discussion
The results of the comparison experiment performed in this study show that the 2D mouse recorded the shortest total working time and the highest click accuracy. This is considered to be because the experimenters were all familiar with the use of a 2D mouse, but less familiar with the other three interface devices. However, the 2D mouse made it difficult for the experimenters to move freely, as they used the device on the desk, which reduced their immersion in the VR environment. The most important reason for implementing a VR environment comes down to reality and immersion, which enable users to perceive the virtual world as if it were real. In this respect, our experimental results confirm that it is difficult to maximize the advantages of a VR environment with an interface device, such as a 2D mouse, that limits body movements.
Comparing the two user interface setups that provided high freedom of movement and immersion (the VR controller, and the Kinect sensor and bend-sensing data glove), the Kinect sensor and bend-sensing data glove reduced the total working time by 29.53 s and yielded a 9.17% higher accuracy compared with the VR controller. General VR content controlled with VR controllers does not require high control accuracy, as it involves selecting or moving relatively large objects; however, a relatively high level of control accuracy is required when performing 3D orebody modeling using mining industry software in a VR environment. The results of the comparison experiment in this study reveal that the Kinect sensor and bend-sensing data glove were the most suitable user interface for controlling mining industry software in a VR environment.
5. Conclusions
In this study, we conducted a performance comparison experiment on four interface devices (a 2D mouse, 2D & 3D mice, a VR controller, and a Kinect sensor with a bend-sensing data glove) that can control KmodStudio, a mining industry software product, in a VR environment. We analyzed the total working time, the number of device clicks, and the click accuracy for a total of ten experimenters after they completed predefined operations in KmodStudio.
We then conducted a survey to evaluate the ease of learning, ease of use, immersion, and fatigue for each device. Our findings show that the 2D mouse achieved the best performance in working time, click accuracy, ease of learning, and ease of use, while the Kinect sensor and the bend-sensing data glove received the highest scores for immersion and fatigue.
The 2D mouse imposes many restrictions on overall body movement, because users control it while sitting on a chair. Therefore, given that the goal of implementing a VR environment is to maximize the immersion that enables users to perceive the virtual world as if it were real, a 2D mouse that limits body movement is not suitable. The Kinect sensor and the bend-sensing data glove, which can maximize immersion with relatively few restrictions on body movement, also showed excellent performance in working time and click accuracy; thus, this combination is considered a suitable user interface for controlling mining industry software in a VR environment.
Currently, 3D user interface devices such as the Kinect sensor and bend-sensing data glove cannot completely replace traditional 2D user interface devices such as keyboards and mice. Although 3D user interface devices are effective in providing a high level of immersion in a VR environment, some operations still require character inputs using keyboards, or precise control using 2D mice to control mining industry software. Nevertheless, as VR technology rapidly advances and various HMD devices are commercialized, it is expected that the market demand for the development and utilization of user interface devices for controlling mining industry software will be even greater.
Although several previous studies have developed user interface devices that can enhance user immersion and effectively control software in a VR environment [22], they did not quantitatively compare the performance of the various user interface devices that can control mining industry software. Therefore, our research findings are expected to serve as a useful reference for the future development of user interface devices in the mining industry. In future work, the workload associated with each user interface device should be evaluated using standard tests such as the NASA-TLX [31].