Evaluating Virtual Hand Illusion through Realistic Appearance and Tactile Feedback

We conducted a virtual reality study to explore the virtual hand illusion through three levels of appearance (Appearance dimension: realistic vs. pixelated vs. toon hand appearances) and two levels of tactile feedback (Tactile dimension: no tactile vs. tactile feedback). We instructed our participants to complete a virtual assembly task in this study. Immediately afterward, we asked them to provide self-reported ratings on a survey that captured presence and five embodiment dimensions (hand ownership, touch sensation, agency and motor control, external appearance, and response to external stimuli). The results of our study indicate that (1) tactile feedback generated a stronger sense of presence, touch sensation, and response to external stimuli; (2) the pixelated hand appearance provided the least hand ownership and external appearance; and (3) in the presence of the pixelated hand, participants' prior virtual reality experience impacted their agency and motor control and their response to external stimuli ratings. This paper discusses our findings and provides design considerations for virtual reality applications with respect to the realistic appearance of virtual hands and tactile feedback.


Introduction
Hands are vital for daily life tasks, and the same applies to virtual reality (VR). Virtual hands are essential for both forceful tasks and manipulation tasks [1]. In previous VR studies, the inclusion of virtual hands was shown to improve task performance for object interaction [2]. Moreover, the existence of virtual hands improves visual cognition, due to increases in short-term memory [3], and enhances cognitive control [4]. In modern VR applications, the integration of virtual hands has become an essential design principle [5].
Nowadays, VR technologies allow users to experience a realistic, immersive environment while observing and feeling their virtual body or body parts. In addition, users can use their VR controllers to interact with virtual objects. Studies in the past demonstrated the importance of virtual hand appearance, indicating that with the existence of virtual hand models, the VR environment became more appealing to users [6][7][8]. Furthermore, a more realistic hand appearance style generates a stronger sense of presence for VR users [9,10] when compared to mannequin hands, robot hands, or simple geometric blocks.
VR developers can also import somatosensory cues in VR by activating tactile feedback for the controllers. A previously conducted study showed that vibrotactile feedback enhances the sense of immersion and interactivity for VR users [11]. Moreover, recent studies have shown that the vibrotactile feedback of VR controllers could increase the perceived realism of the appearance of the virtual hands [12], as well as the sense of presence and involvement [13] of VR users.
"Virtual hand illusion" (VHI) describes the illusion of ownership of virtual hands [14]. It refers to the observation that VR users perceive "body ownership" if a virtual hand moves or senses feedback in a virtual environment. VHI studies have explored the effects of different hand model geometries (e.g., realistic hand shape, mannequin hand shape, Figure 1. The three different hand appearances we examined in this study (from left to right: realistic, pixelated, and toon hand appearance).
Note that shaders are widely used in games not only to provide detailed rendering (e.g., shadows, lighting, and landscape) [24], but also to generate more realistic visual effects (e.g., sprites, particles, and texture gradients) in virtual environments [25,26]. We think that understanding the impact of the realistic appearance of virtual hands will improve our understanding of VHI and thus provide more insights to researchers and VR developers. To do so, and based on our study design, we aim to answer research questions concerning the realistic appearance of virtual hands (RQ1), tactile feedback (RQ2), and participants' prior VR experience (RQ3).
This paper's structure is as follows: Section 2 presents work related to our study. Section 3 provides details of the methodology followed. Section 4 presents the results obtained. In Section 5, we discuss our findings, along with our limitations and design considerations. Finally, we discuss our conclusions and future work directions in Section 6.

Related Work
Embodiment, the sense of substituting a user's body with a virtual avatar in a first-person view [27], has been studied widely in projects exploring self-motion artifacts [28], visuomotor calibration [29], tactile sensation [30], and external appearance [31]. Previously published research has revealed various factors that can impact embodiment, such as the location of the body, body ownership, agency and motor control, and external appearance [32]. Previous studies have indicated that a virtual avatar in a similar location to the participant [33], agency over the virtual body (such as the synchronous movement of body parts) [34], and both look-alike self-avatars and self-avatars of different genders [35] and racial groups [36] enhance the illusion of embodiment. In summary, participants with a collocated body will experience the embodiment illusion, and appearance and agency can enhance that illusion [32].
The virtual hand has become an essential topic in understanding embodiment since it is the most used and seen body part in a first-person view in a virtual environment. Users can perceive their virtual hands using either explicit (actively assessing body size and shape, e.g., "How long is my hand?") [37] or implicit methods (finding landmarks on specific body parts, e.g., "Where is my wrist?") [38]. In VR, these methods correspond to visual and haptic methods [39]. Studies in VR have found both ways to be effective, by providing a realistic hand appearance for VR hand models [9,15,40] and tactile feedback for participants to locate their hands and landmarks [12].
With the development of different input devices, such as finger-tracking gloves and motion controllers, VR hands can now achieve a high level of interaction [41]. In designing interaction techniques, previously conducted studies compared various input methods. Moehring and Froehlich [42] compared finger-based and controller-based control in both head-mounted display (HMD) and cave automatic virtual environment (CAVE) setups. They found that finger-based interaction was preferred, but controller-based interaction had better performance [42]. Lin and Schulze compared grasping gestures and controller manipulation for object interaction [43]. They found that participants favored gestures but felt VR controllers were more reliable [43]. Note that finger- and gesture-based interaction caused uncertainties due to limited capabilities and reliability when compared with controller buttons [43,44].
One crucial factor commonly studied with virtual hands is tactile feedback, which we perceive through our skin. As mentioned before, the tactile sensation is an essential factor contributing to embodiment [30]. Therefore, understanding the perception of tactile feedback has become crucial for VR studies. A previously conducted study showed that different sensation areas have different sensitivity thresholds for tactile feedback frequencies [45]. Moreover, a previously published study indicated that the sense of tactile feedback on specific body parts, such as the elbow and wrist, could produce acute spatial correlation [46]. On that basis, Gaudeni et al. [47] successfully relocated sensed body parts using a haptic ring to deliver fingertip tactile information to a different hand location.
Modern approaches have found a significant effect of vibrotactile feedback on embodiment and VHI in VR studies for various tasks [48][49][50]. Tactile feedback activated on either the whole body or body parts improved users' presence in immersive environments for rehabilitation [51], training [52], gaming [53], and other close-contact interactions with virtual characters [54][55][56]. The studies conducted by Guterstam et al. [57] and Fossataro et al. [58] indicated that the tactile feedback of a virtual hand increases an individual's sense of ownership of hands and arms. Furthermore, Frohner et al. [48] concluded that vibrotactile feedback fostered hand embodiment. In a different approach, Moore et al. [49] explored the effect of tactile feedback patterns (natural, natural plus local vibratory, local vibratory, proximal vibratory, and no tactile feedback) on VHI. Although Moore et al. [49] did not find significant differences, all tactile feedback patterns generated VHI.
Besides VHI studies conducted in virtual environments, many studies have explored hand ownership in the real world. In rubber hand illusion (RHI) studies, applying a synchronous stroke to both the real and fake limb was proven effective in generating a sense of ownership over the fake limb [59,60]. In a later study, Tsakiris and Haggard [61] revisited the RHI study and found that the rubber hand illusion resulted from the adaptation to the mislocalization of the simulated body part. They suggested that the bottom-up building process of visuotactile correlation was critical to the illusion [61]. Moreover, the appearance of the fake limb also impacted the RHI [62,63]. In a study with different shapes of limbs (four paper-cut cardboard limbs and a realistic rubber hand), the realistic 3D shape generated the strongest ownership. Therefore, the body model played an essential role in maintaining a coherent sense of one's body [62].
Based on RHI and VHI studies, recent virtual hand illusion studies have focused on exploring elements that could affect the body ownership of virtual hands in virtual environments. Previous studies have found that multisensory integration could generate a sense of hand ownership [59]. In setups similar to the RHI studies, VR studies found significant effects of virtual hand appearance [9,14,15,19]. A realistic hand appearance generated the strongest VHI compared with other hand models [9,15]. Pyasik et al. [64] mentioned that a detailed hand appearance could be an essential factor contributing to VHI.
Another direction of VHI studies was the relationship between agency and motor control. Sanchez-Vives et al. [65] asserted that synchrony between visual and proprioceptive information induces an illusion of ownership. Brugada-Ramentol [66] verified this by finding that incongruent limb control decreases the sense of ownership. Lastly, other studies found that a high physical load alongside multisensory integration with free movements for virtual hands boosts implicit agency and improves VHI [67,68].
Although there are many studies on the effect of different hand models, little knowledge is available to understand the impact of a hand's realistic appearance on VHI. A previous study showed that appearance fidelity was highly related to 3D rendering quality [69], which is considered an important design element, similar to audio and video quality [70,71]. Rendering style was the most popular technique to determine the rendering output of a 3D environment [24]. Many VR studies used different rendering styles to explore the impact of a realistic appearance [69,72,73]. In conclusion, a realistic appearance had a significant effect on the quality of experience and task completion time [69], change in emotional state [72], and perception of human expression [73]. It is necessary to explore the realistic appearances of virtual hands to understand the potential effects on VHI. In our study, we decided to explore the differences in the rendering styles of a realistic hand model. We decided to only use a single hand model (a realistic hand model) in our experiment across all conditions because we wanted only a single appearance manipulation to be present at a time. Thus, this study utilizes knowledge from previously published research and builds upon it. We conducted a 3 (Appearance) × 2 (Tactile) study to determine the key factors contributing to the sense of presence and the five abovementioned body ownership dimensions. Our results will contribute to a further understanding of the factors contributing to VHI.

Methodology
The subsections below present the methodology we followed in this study.

Participants
We conducted an a priori power analysis to determine the sample size of this study. For our calculations, we used 95% power, a medium-to-large effect size of f = 0.32, six (3 × 2) groups, and an α = 0.05. The analysis resulted in a recommended sample size of 120 participants (20 participants per group).
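The calculation above can be reproduced numerically with SciPy's noncentral F distribution. This is a minimal sketch, not the authors' exact tool output: the function name is ours, and the recommended N depends on which effect (the six-group omnibus test vs. a single main effect) the analysis is powered for.

```python
from scipy.stats import f as f_dist, ncf

def anova_sample_size(effect_f, k_groups, alpha=0.05, power=0.95):
    """Smallest total N for which a one-way ANOVA F test reaches the
    target power. effect_f is Cohen's f; the noncentrality parameter
    of the alternative distribution is lambda = N * f**2."""
    df1 = k_groups - 1
    n = k_groups + 2  # need at least a few error degrees of freedom
    while True:
        df2 = n - k_groups
        crit = f_dist.ppf(1 - alpha, df1, df2)       # critical F value
        lam = effect_f ** 2 * n                      # noncentrality
        achieved = 1 - ncf.cdf(crit, df1, df2, lam)  # achieved power
        if achieved >= power:
            return n
        n += 1
```

For example, `anova_sample_size(0.32, 6)` powers the six-group omnibus test (and yields a larger N), while powering a single two-level main effect at the same f (`k_groups=2`) gives a result closer to the 120 participants reported above.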
We recruited 120 participants (18-48 years old; M = 19.49, SD = 3.10) for our study through emails and class announcements made to undergraduate and graduate students in our department. Among our participants, 84 were male and 36 were female, equally distributed across participant groups. From the sample, 41 participants had no prior experience with VR, and 79 participants had previously experienced VR. All students volunteered for our study.

Virtual Reality Application
We developed our VR application for the study in Unreal Engine 4 and deployed it on an Oculus Quest 2 head-mounted display. The Oculus Quest 2 set had two controllers and an HMD with an internal camera sensor. The Quest 2 HMD was connected to a PC to run the application. We developed an instrumental task for our study because we noted the increasing number of VR applications for training and performing activities. Additionally, we expected our participants to use their hands actively in performing certain tasks rather than passively observing their bodies. In our application, we implemented two sections: the first was the training (tutorial) section, and the second was the testing section (see Figure 2). We implemented the tutorial section to provide our participants with instructions and familiarize them with the use of virtual reality controllers. Doing so has been shown to significantly improve virtual reality users' performance [74]. Specifically, the tutorial section was reasonably straightforward, including only a gear part, a transparent gear slot, and a red button that started the testing section. The participants had the freedom to move around in the environment and pick up and assemble the gear onto the transparent slot. Participants would only need to use the grip button to "grab" the gear 3D model and "assemble" it by moving the gear toward the transparent slot. When the gear 3D model was close enough to the target slot, the gear would automatically "snap" onto the slot to stay in place. Participants could press the red button to proceed into the testing section whenever they were ready for the experiment.
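The "snap" behavior described above amounts to a distance check run while a gear is held. The sketch below illustrates the idea; the 5 cm threshold and the function name are hypothetical and not taken from the authors' Unreal Engine 4 implementation.

```python
import math

SNAP_DISTANCE = 0.05  # metres; hypothetical snap threshold

def update_held_gear(gear_pos, slot_pos):
    """Return the gear's new position and whether it snapped onto the slot."""
    if math.dist(gear_pos, slot_pos) <= SNAP_DISTANCE:
        return slot_pos, True   # close enough: snap in place on the slot
    return gear_pos, False      # still held freely by the controller
```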
The testing section included a long counter in the front, and a large white cube in the center with four transparent slots on the front side and four gears on the cube's left side. We created six variations of the testing section, where each variation was assigned one of the hand appearances, with or without tactile feedback activated. We set the frequency and the intensity of the tactile feedback of the Oculus Touch controllers to maximum (frequency = 1.00 and intensity = 1.00). The grasping motion of the hand was the same for all conditions. The task for the participants was to assemble all four gears onto the cube. After that, a transparent sphere would appear in the center of the cube's front side. Participants would put their right hand inside the 3D sphere model for two seconds. Then, a white spike would come out to "pierce" the virtual hand. Depending on the participant group, the tactile feedback of the controller was either activated or not. The spike was added as a metaphor to replicate the rubber hand illusion study. In the initial RHI study, a knife hit the fake limb at the end to explore the effect of synchronous strokes on hand ownership [59].
Figure 2. The training (tutorial) and testing sections of our application. From left to right: the tutorial level, which was provided to participants to familiarize them with the operations and mechanics (left); the testing section, in which participants were asked to assemble the gears (middle); and the spike "piercing" the virtual hand (right).

Experimental Conditions
By following a 3 (Appearance dimension: realistic vs. pixelated vs. toon hand appearances) × 2 (Tactile dimension: no tactile vs. tactile) between-group study design, we developed six experimental conditions: realistic tactile, pixelated tactile, toon tactile, realistic no tactile, pixelated no tactile, and toon no tactile. We used a disembodied body ("floating" hand) because we considered previous studies [12,48,75] and commercial applications, such as BOXVR and Job Simulator, that also used "floating" hands. Each group consisted of 20 participants (14 male and 6 female). Note that, in the conditions in which we activated the tactile feedback, the controller would vibrate when grabbing the objects and getting "pierced" by the spike.
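The six between-group conditions and the condition-dependent vibration rule can be sketched as follows. The event names and the helper function are hypothetical, but the parameter values mirror the maximum frequency and intensity reported for the Oculus Touch controllers above.

```python
APPEARANCES = ("realistic", "pixelated", "toon")
TACTILE_LEVELS = ("tactile", "no tactile")

# The six experimental conditions of the 3 (Appearance) x 2 (Tactile) design.
CONDITIONS = [(a, t) for a in APPEARANCES for t in TACTILE_LEVELS]

def haptic_command(event, tactile_level, frequency=1.0, intensity=1.0):
    """Vibration parameters for an interaction event, or None.

    Haptics fire only on "grab" and "pierce" events, and only in the
    tactile conditions, as in the study setup.
    """
    if tactile_level == "tactile" and event in ("grab", "pierce"):
        return {"frequency": frequency, "intensity": intensity}
    return None
```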

Measurements
In our study, we collected self-reported data using the Slater-Usoh-Steed (SUS) scale [76] on presence, as well as through the avatar embodiment scales (virtual hand ownership, touch sensations, agency and motor control, external appearance, and response to external stimuli) by Gonzalez-Franco and Peck [32]. We used a 7-point Likert scale for all questions/statements (see Table 1). We disseminated the scales to our participants using the Qualtrics online survey tool offered by our university. Note that we made the necessary adjustments to the questions/statements to fit the scope of the study.

Presence
2. To what extent were there times during the experience when the virtual environment was the reality for you? (1 = not at all, 7 = totally)
3. During the time of the experience, which was the strongest on the whole: your sense of being in the virtual environment or of being elsewhere? (1 = being in the virtual environment, 7 = being elsewhere)
4. During the time of your experience, did you often think to yourself that you were actually in the virtual environment? (1 = strongly disagree, 7 = strongly agree)

Hand Ownership
5. I felt as if the virtual hand was my real hand. (1 = strongly disagree, 7 = strongly agree)
6. It felt as if the virtual hand I saw was someone else's. (1 = strongly disagree, 7 = strongly agree)
7. It seemed as if I might have more than one hand. (1 = strongly disagree, 7 = strongly agree)

Tactile Sensation
8. It seemed as if I felt the vibration of the equipment in the location where I saw the virtual hand touch the object. (1 = strongly disagree, 7 = strongly agree)
9. It seemed as if the touch I felt was located somewhere between my physical hand and the virtual hand. (1 = strongly disagree, 7 = strongly agree)
10. It seemed as if the touch I felt was caused by the hand touching the object. (1 = strongly disagree, 7 = strongly agree)
11. It seemed as if my hand was touching the gears. (1 = strongly disagree, 7 = strongly agree)

Procedure
Upon arrival at our research lab, where the study took place, we asked the participants to read the consent form and sign it upon their agreement. The Institutional Review Board (IRB) of the university approved our study. After agreeing to participate in our research and signing the consent form, we asked the participants to provide demographic data in Qualtrics. Then, we asked them to put on the Oculus Quest 2 HMD and adjust it to their heads accordingly. After the adjustments and after indicating that they were ready for the study, the experimenter started the application from the tutorial section. After becoming familiar with the operations, the participants pressed the red button on the table to continue to the testing section, where they experienced one of the experimental conditions. As mentioned above, we asked the participants to finish the four-gear assembly first and then to experience the "piercing" of the spike. We used the four-gear assembly task to immerse the participants in the virtual environment and get them to start thinking of the virtual hand as their own. Then, the spike "pierce" was used as a metaphor for the RHI experiment. After the participants pressed the red button again, they would end the application and take off the headset. We then instructed the participants to fill out the post-survey. After that, the researcher answered the participants' questions, and, finally, the participants were thanked and dismissed. Each participant spent less than 30 min in our study. Figure 3 illustrates the experimental setting of our research and a participant using our application. Considering the current COVID-19 pandemic and following our institution's instructions, we allowed a 30-minute gap between participants to minimize infection risk. Participants were either vaccinated or followed the instructions of our university regarding regular testing (at least once per week) for COVID-19.
Once the participants arrived at the lab, we asked them to sanitize their hands, and during the study, all participants wore face masks and VR sanitary masks. Additionally, the research team carefully sanitized all equipment and furniture before the participants' arrival and after each study session.

Results
In this section, we report all the results of this study. We performed all analyses using the IBM SPSS v.26.0 statistical analysis software. The normality assumption of all self-reported ratings was screened with Shapiro-Wilk tests at the 5% level and graphically using the Q-Q plots of the residuals. We tested the measured items of the scales for reliability (Cronbach's alpha: 0.72 ≤ α ≤ 0.90), and due to sufficient correlation, we used the cumulative score of all items for each scale as the final result and treated them as continuous scales [77,78], which is a typical practice. No removal of items would have enhanced the internal reliability of the scales. For all statistical tests, p < 0.05 was deemed as statistically significant. In addition to the results presented below, we would like to note that we also screened our data for gender and age differences. We did not find significant results; therefore, we do not report such analyses below. Figure 4 illustrates the boxplots of our results.
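The reliability and scoring steps above can be reproduced with a few lines of NumPy. This is a generic sketch of Cronbach's alpha and cumulative scale scoring, not the authors' SPSS output.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for a (participants x items) rating matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)        # per-item variances
    total_var = items.sum(axis=1).var(ddof=1)    # variance of the summed scale
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

def scale_score(items):
    """Cumulative score per participant, used here as the final scale result."""
    return np.asarray(items, dtype=float).sum(axis=1)
```

With perfectly consistent items (each column differing only by a constant shift), `cronbach_alpha` returns 1.0, and lower inter-item correlation drives it toward 0.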

Response to External Stimuli
No (Appearance × Tactile) interaction effect was found (F(2, 118) = 1.073, p = 0.346, η²p = 0.018) on the response to external stimuli. However, we found a significant main effect for the Tactile dimension (F(1, 119) = 9.740, p = 0.002, η²p = 0.079). Post hoc comparison showed that participants exposed to the no-tactile level (M = 3.34) rated their response to external stimuli significantly lower than those exposed to the tactile level (M = 4.04). Finally, the analysis did not provide a statistically significant main effect for the Appearance dimension (F(1, 119) = 1.997, p = 0.140, η²p = 0.034).
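Partial eta squared can be recovered from an F statistic and its degrees of freedom. The sketch below assumes the error degrees of freedom of a fully between-subjects 3 × 2 design with N = 120 (120 − 6 = 114), under which the reported effect sizes are reproduced.

```python
def partial_eta_squared(f_value, df_effect, df_error):
    # eta^2_p = SS_effect / (SS_effect + SS_error)
    #         = (F * df_effect) / (F * df_effect + df_error)
    return f_value * df_effect / (f_value * df_effect + df_error)
```

For instance, the Tactile main effect above gives `partial_eta_squared(9.740, 1, 114)` ≈ 0.079, and the interaction gives `partial_eta_squared(1.073, 2, 114)` ≈ 0.018.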

Prior VR Experience
We conducted a three-way ANOVA (Appearance × Tactile × Prior VR Experience) to explore whether prior VR experience could potentially impact participant ratings (see Figure 5 for the interaction plots). For agency and motor control, our analysis revealed neither a three-way (Appearance × Tactile × Prior VR Experience) interaction effect (F(2, 118) = 0.393, p = 0.676, η²p = 0.007) nor a two-way (Tactile × Prior VR Experience) interaction effect (F(1, 119) = 0.103, p = 0.749, η²p = 0.001). However, it revealed a two-way (Appearance × Prior VR Experience) interaction effect (F(1, 119) = 3.594, p = 0.031, η²p = 0.062). In the presence of a pixelated hand appearance, we found that participants with no prior VR experience provided a higher rating on agency and motor control than those with prior VR experience. We also found a significant main effect for Prior VR Experience (F(1, 119) = 6.831, p = 0.010, η²p = 0.059). Pairwise comparison showed that participants with no prior VR experience provided lower ratings for agency and motor control than those with prior VR experience.
We also explored the response to external stimuli variable. According to our results, the ANOVA did not reveal a three-way (Appearance × Tactile × Prior VR Experience) interaction effect (F(2, 118) = 0.208, p = 0.813, η²p = 0.004). Moreover, we did not find a two-way (Tactile × Prior VR Experience) interaction effect (F(1, 119) = 2.544, p = 0.114, η²p = 0.023). However, we did find a statistically significant two-way (Appearance × Prior VR Experience) interaction effect (F(2, 118) = 3.559, p = 0.032, η²p = 0.062). Specifically, we found that in the presence of the pixelated hand appearance, participants with no prior VR experience provided lower ratings of their responses to external stimuli compared with those with prior VR experience. Finally, we did not find a main effect of Prior VR Experience on the response to external stimuli (F(1, 119) = 0.436, p = 0.510, η²p = 0.004).

Discussion
In this study, we aimed to explore how the appearance of virtual hands (Appearance dimension) and tactile feedback (Tactile dimension) impacted our participants in terms of presence and body ownership dimensions (hand ownership, touch sensation, agency and motor control, external appearance, and response to external stimuli). In addition, we reported significant findings regarding the effect of participants' prior VR experience. The subsections below discuss our findings and our study's limitations and design considerations.

RQ1: Realistic Appearance
To answer RQ1, according to our results, we first found significant Appearance effects on hand ownership. Participants who had the pixelated hand reported significantly lower hand ownership than those who had experienced the toon and realistic hands. Previous studies have already demonstrated the effect of avatar realism in immersive environments, indicating that a realistic avatar tends to generate stronger virtual body ownership [62,79,80]. Moreover, studies regarding virtual hands specifically mentioned that a more realistic/human-like hand style would provide a stronger hand illusion and perceived limb ownership when compared with hand models of different geometries and abstractions [9,15,19]. In addition, Pyasik et al. [64] found that detailed hand appearance could be a contributing factor for VHI, and later studies [64,81] confirmed this conclusion. In our research, the pixelated hand appearance obscured the details of the virtual hand. Our study results extend the previously reported findings, indicating that appearance changes alone (we kept the geometry of the virtual hand as is) can also impact the hand ownership illusion of participants. We think that the pixelated hand presented participants with a stylistic clash against the virtual environment, which made them rate hand ownership lower compared with those exposed to the toon or realistic hand appearance. Therefore, we think a more detailed appearance (realistic and toon hand appearance) was more likely to generate a stronger sense of hand ownership than a less detailed one (pixelated hand appearance). Furthermore, no participants provided negative comments about using a hand model that resembles a real hand, which is a similar finding to previously published studies [9,15].
Our statistical analysis also revealed statistically significant results of Appearance on external appearance. Participants exposed to the pixelated hand rated the appearance lower than those exposed to the realistic hand appearance. Previously conducted studies revealed that avatar body style was related to appearance realism [40,82], as well as that the realistic hand model had a stronger effect on the ratings of external appearance compared to other less realistic hands [12]. In addition, based on the comments from the participants, the pixelated hand was referred to as "blurry," "mosaic," and "not clear enough to recognize." In previous studies, researchers mentioned the visual perception of virtual hands via implicit and explicit methods [37,38,83]. Referring to the comments of our participants (e.g., "I felt my hand was a bit blurry," or "I could not connect my pixelated hand to the real hand"), we think the pixelated hand appearance did not help participants perceive the appearance of the virtual hands in either the explicit (size and shape) [37,83] or the implicit (location of landmarks, in this case, finger joints and wrist) [38] sense: the pixelated hand appearance made the virtual hand unclear and provided less information. This is especially true when considering that the realistic hand model contributed a distinct visual appearance.

RQ2: Tactile Feedback
There were several statistically significant results in terms of tactile feedback (RQ2). We first found a significant effect on presence. Participants exposed to no tactile feedback conditions rated their presence levels lower than those exposed to tactile feedback. Previously conducted studies have found that tactile feedback on different avatar body parts can generate a stronger sense of presence [30,84], and that participants exposed to no tactile feedback conditions reported lower presence levels than those exposed to tactile feedback [53]. Khamis et al. [13] also explored the effect of tactile feedback on presence. Their study showed that activating tactile feedback resulted in a higher reported sense of presence when compared with no tactile feedback conditions [13]. In our experiment, tactile feedback was activated when touching the gears and being pierced by the spike. We think that tactile feedback made participants feel that their hands were in the virtual world along with other objects. In contrast, the participants who were not exposed to tactile feedback indicated a lack of connection between the virtual world and object interaction, which is a result that confirms previously published works [13,30,53,84].
Participants who were exposed to tactile feedback reported stronger tactile sensations when interacting with virtual objects than those not exposed to tactile feedback. Fossataro et al. [58] found that tactile feedback could reduce delusional body ownership for brain-damaged patients. This result suggests that tactile feedback can help restore and enhance the touch sensation of the hands. Other studies showed that tactile feedback provided a stronger sense of touch to VR users [11,85]. In this experiment, we instructed the participants to grasp and manipulate gears. The tactile feedback was activated when grabbing the gear, providing participants with a sense of touch location and object interaction.
Finally, we found that tactile feedback significantly affected the response to external stimuli. For our experiment, we borrowed and adapted the RHI idea. We used a spike to pierce the participants' right hand. Participants who were exposed to tactile feedback rated their response to external stimuli more highly than those who were not exposed to tactile feedback. In the original rubber hand illusion study, Botvinick and Cohen [59] found that synchronous stroking generated the sense of "phantom limbs" and noted the reactions of participants after the "stab." In a more advanced setup, D'Alonzo et al. [86] also found that vibrotactile feedback impacted the phantom limb sensation on a prosthetic limb, while also causing a stronger reaction to stimulation. Vargas et al. [83] mentioned that vibrotactile feedback could increase the proprioceptive recognition of a fake limb and could be the reason for a more significant reaction to external stimuli. As for our experiment, the assembly process made participants feel that the virtual hand became their hand, since they controlled it. Therefore, the activated tactile feedback caused participants to have a more realistic reaction when the spike came out and pierced their virtual hands.
On the flip side, participants who were not exposed to tactile feedback did not react to the spike, even though they saw the spike piercing their virtual hand.

RQ3: Prior VR Experience
We found interesting results related to our participants' Prior VR Experience (RQ3). Our first finding was an interaction effect between Appearance and Prior VR Experience on agency and motor control: participants with no prior VR experience reported higher ratings of agency and motor control in the presence of the pixelated hand appearance. Perhaps participants with no prior VR experience had no other virtual hand appearances to compare against and therefore rated their agency higher than participants with prior VR experience, who most likely had in mind a reference hand used in the past. We also found a significant main effect on agency and motor control: participants with prior VR experience rated these aspects higher than those with no prior VR experience. According to Haggard [87], past experience can influence the sense of agency during specific actions, and Bregman-Hai et al. [88] revealed a relationship between memories and agency. In our study, most participants had previously experienced VR; they were familiar with VR controls and could finish the tasks smoothly. On the flip side, we think that participants with no prior VR experience had to make an extra effort to get used to the VR devices, and perhaps this is why they rated their agency and motor control lower.
We found another significant interaction effect between Prior VR Experience and Appearance on response to external stimuli. Specifically, in the presence of the pixelated hand appearance, participants with no prior VR experience provided higher ratings of the response to external stimuli than those with prior VR experience. Based on the comments from our participants, those without prior VR experience considered the VR application interesting and provided higher ratings when exposed to the pixelated virtual hand; these VR-naïve participants did not mention any contradiction between the virtual hand style and the tactile feedback. On the other hand, the VR-experienced participants reported that the virtual hand felt unnatural; they expected a higher-fidelity, realistic hand for the interaction. We think that the pixelated hand, with its "mosaic" appearance, made these participants feel that such a hand model could not provide a realistic response to external stimuli, and for this reason they gave lower ratings. Moreover, the participant group with no prior VR experience might have found a match between the low-quality hand appearance and the tactile feedback, and we think this "quality matching" would positively impact such a group. However, we could not find any previously conducted studies to support these findings. Therefore, we argue that future research exploring this interaction effect is necessary to understand such results further.

Limitations
Our study has several limitations that should be reported and considered in future studies. Most of our participants were undergraduate students from our department; thus, we could not divide them into near-equal-size age groups due to the narrow age range. Furthermore, although we tried to recruit more female participants, a gender imbalance remained in each group (14 males and 6 females), which might have impacted our findings.
The hardware (tactile feedback device) used in this study is another limitation. The Oculus Quest controllers cannot provide rich contact and shape information when interacting with virtual objects. In our case, compared with haptic gloves and fingertip devices, the controllers could not generate detailed feedback for the gear and the spike. Pacchierotti et al. [89] asserted the importance of tactile feedback quality; thus, we think that using a higher-quality tactile feedback device would yield more accurate study results.
Furthermore, participants were exposed to a relatively short tutorial session. Based on our discussion, we assume that participants without prior VR experience did not have enough time to get used to the application and the VR system. Therefore, a longer and more comprehensive tutorial session might be necessary for better adaptation.
We consider the design of the spike an additional limitation. Some participants sensed the tactile feedback of the spike but did not instantly notice the spike itself. Other studies, such as Lin and Jörg [9], used a knife falling from midair, which was more visually salient. Future studies should provide a more exaggerated visualization as a VR metaphor for the RHI.
Finally, we would like to mention a limitation with regard to handedness. We noticed that some participants used their left hand as their dominant hand during the study, yet we asked all participants to use their right hand to experience the spike. It would be interesting to explore the effect of hand dominance in this scenario.

Design Considerations
We think the knowledge obtained from this study should be documented to help the research community develop more efficient VR assembly applications that involve virtual hand appearance and tactile feedback. Therefore, this section presents our reflections on choosing an appropriate hand representation and the usage of tactile feedback.
Because hand appearance affects hand ownership and external appearance, we argue that it is more appropriate to use toon- or realistic-style shaders for virtual hands. Unless a future study confirms the positive findings on the matching between low-quality hand appearance and tactile feedback, we recommend avoiding pixelated hand appearances, since such a representation induces a weaker VHI and participants regarded it as "unable to perceive" and "not visually clear." In terms of tactile feedback, we recommend activating vibrotactile feedback to provide a realistic VR experience when interacting with objects, since tactile feedback impacts presence, touch sensation, and response to external stimuli. We think all of the above could be effective tools for enhancing the virtual hand illusion.

Conclusions and Future Work
We conducted a VR study to explore the effects of hand appearance and tactile feedback on presence and body ownership dimensions. We developed a VR application in which participants assembled different parts into a 3D model, with an imitation of the RHI study at the end of the tasks. In response to RQ1, the pixelated hand appearance induced a lower perception of hand ownership and external appearance. In response to RQ2, tactile feedback generated a stronger sense of presence, touch sensation, and response to external stimuli. In response to RQ3, we found that prior VR experience enhanced agency and motor control while reducing the response to external stimuli ratings.
As for our future work, there are several avenues to investigate. Modern VR applications already make good use of legs, feet, and even the upper body; thus, in future research, we would like to explore the effect of the realistic appearance of the whole body of self-avatars. Moreover, the effect of the parameters of tactile feedback (e.g., intensity and frequency) on the ownership of virtual hands is a direction worth exploring. Such studies can help us better understand how to evaluate realistic appearance and tactile feedback in self-avatars and user-controlled body parts in VR applications. Finally, we can also deepen the research on hand ownership by using the Oculus Quest built-in hand-tracking interaction model as an alternative to physical controllers.

Informed Consent Statement: Informed consent was obtained from all subjects involved in the study.
Data Availability Statement: Data and experimental materials used in this study are available upon request.

Conflicts of Interest:
The authors declare no conflict of interest.