Comparing VR- and AR-Based Try-On Systems Using Personalized Avatars

Abstract: Despite the convenience offered by e-commerce, online apparel shopping presents various product-related risks, as consumers can neither physically see nor try products on themselves. Augmented reality (AR) and virtual reality (VR) technologies have been used to improve the online shopping experience. We therefore propose an AR- and VR-based try-on system that provides users with a novel shopping experience in which they can view garments fitted onto their personalized virtual body. Recorded personalized motions allow users to dynamically interact with their dressed virtual body in AR. We conducted two user studies to compare the different roles of VR- and AR-based try-ons and to validate the impact of personalized motions on the virtual try-on experience. In the first user study, the mobile application with AR- and VR-based try-on was compared to a traditional e-commerce interface. In the second user study, personalized avatars with pre-defined motion and with personalized motion were compared to a personalized no-motion avatar in AR-based try-on. The results show that AR- and VR-based try-ons can positively influence the shopping experience compared with the traditional e-commerce interface. Overall, AR-based try-on provides a better and more realistic garment visualization than VR-based try-on. In addition, we found that personalized motions do not directly affect the user's shopping experience.


Introduction
With the continuous development of internet and mobile technology, apparel e-commerce is expanding rapidly worldwide, and the number of consumers purchasing clothes online keeps increasing [1]. However, approximately $62 billion worth of returns occur annually in the fashion industry [2], and almost 56% of consumers reported garment style and fit as major sources of concern with online shopping. Most online shopping websites display garments through 2D photos of garments and human models, so consumers' purchasing decisions depend largely on 2D images, without the ability to examine a piece of clothing in detail or better understand its style. Apparel is closely tied to an individual's physical body. On typical online shopping websites, which are purely digital environments, consumers cannot physically try on clothes. Discrepancies may exist between the actual products and consumers' perceived body sizes, making it difficult to judge fit [3], which may negatively impact the shopping experience and purchase intention [4,5].
Today, with the growth of virtual try-on (VTO) technology, consumers can browse a broad range of products and try items on in the online shopping environment. VTO technology simulates the consumer's physical figure with virtual models built from human measurements [6]. VTO can meet the various needs of consumers and help them accurately assess fit and size in the online shopping environment [6,7].
Potential concerns with VTO simulations, such as accuracy, may give rise to consumer dissatisfaction with the fitting results; as a result, VTOs have not been widely adopted to date [7,8]. Poor representation of the consumer's actual body and the lack of a face representation can make a VTO feel unrealistic, strongly affecting the quality of the shopping experience. Prior work has therefore concentrated on accurately simulating the human representation; the low level of accuracy could be partly due to the lack of advanced 3D avatar technology. Some VTOs use a 3D virtual avatar created from the user's body measurements and facial features [9]; such an avatar may increase the accuracy of virtual fitting and enhance the hedonic experience while shopping [10]. Furthermore, when trying on clothes in real life, consumers may pose in front of the mirror to further assess the fit and style of the garments, so perceptual features such as personalized motion may also influence perceived accuracy. However, most VTOs emphasize physical body measurements to personalize a virtual self as accurately as possible, overlooking the impact of personalized motion on the try-on experience.
Recently, AR and VR technologies have been applied in various fields, such as travel, education, entertainment, training, and fashion [11][12][13]. AR technology refers to the overlay of virtual 3D objects onto one's view of the physical real environment [14]. AR in online shopping apps has also been gaining momentum; for example, eyeglasses or jewelry try-on mobile apps help customers try on different eyeglasses or jewelry without physically putting them on [15]. Using AR technology, a VTO enables consumers to try a number of products augmented onto a mirror image of themselves via a digital display with a camera directed at them, commonly called a "magic mirror" [16]. A number of fashion firms, including Uniqlo and Gap, have employed AR fitting rooms in the form of a magic mirror; however, the magic mirror is usually deployed in physical shops. Therefore, some fashion brands, such as RIXO London and Tommy Hilfiger, have allowed consumers to perform virtual fitting and view a virtual catwalk in a mobile app [17] using VR technology, which creates a realistic-looking world in an entirely synthetic environment using computer graphics. With this type of VTO, consumers can feel present in the simulated fitting room [18]. VR fitting has the potential to alter the consumer experience in a new way because it can provide a more immersive fitting experience. To date, consumer responses to VTO during shopping have been mixed, and the variation between AR- and VR-based try-ons has not been examined.
In this paper, we discuss e-commerce, VR/AR technologies, and how AR/VR may create value for online retailers by overcoming the limitations of traditional e-commerce. We also examine how personalized motion can contribute to the consumer shopping experience.
To validate how AR/VR may contribute to online shopping, we compare the roles of AR and VR in the VTO experience. To understand how personalized motion is related to consumer attitudes toward the shopping experience, we developed an interactive VTO system that supports personalized interactions through a smartphone. Users can view their dressed body in the real world from different viewpoints. In addition, we propose that the virtual body model include the user's own personalized motion. This gives users a better understanding of whether a garment suits them while moving in their own unique way. The system implements interaction techniques across the AR and VR technologies and demonstrates how each can be used for online shopping tasks. To gather feedback on the AR/VR try-on interaction techniques and personalized motion, we ran two user studies that provided insights, observations, and guidance for future work.

AR- and VR-Based Try-On
VTO is one form of image interactivity technology (IIT) [10] that simulates users' online shopping experience by using AR or VR technology. Previous research on VTO can be divided into AR-based virtual try-on and VR-based virtual try-on.

AR-Based Try-On
AR is a computer-simulated interactive technology that enriches the user experience by integrating additional information into the user's real world, providing more realistic, self-descriptive product experiences. Previous research has studied two types of AR VTO: (1) 2D overlay AR-based VTO, which superimposes 2D images of products onto the user's body; and (2) 3D AR-based VTO, which provides a 3D visualization of products on the user's body.

• 2D Overlay AR-Based Try-On: Earlier work on 2D overlay virtual try-on was mostly conducted in computer graphics [19][20][21]. The 2D overlay VTO projects a 2D image of a product onto an image of the user and the real environment around the user. Hilsmann et al. re-textured a garment overlay for real-time visualization of garments in a virtual mirror environment [22]. Yamada et al. proposed a method for reshaping the garment image based on human body shapes to make fitting more realistic [23]. Zheng et al. aligned the target clothing item on a given person's body and presented a try-on look of the person in different static poses [24]. However, 2D overlay virtual try-on does not adapt well to dynamic poses when users perform certain activities. In addition, like many other re-texturing approaches, these methods operate only in 2D without using 3D information in any way, which prevents users from viewing their virtual self from arbitrary viewpoints [25].
In general, 3D AR-based try-on is superior to 2D overlay AR-based try-on, because matching a 3D garment model to a 3D avatar presents a more accurate representation of the garment and its fit, and also provides users with a multi-angle view of the garments. In our research, we propose a method that lets users view the fitting interactively and check the garments by augmenting the motion of a personalized avatar in the real world.

VR-Based Try-On
Virtual reality technology generates a fully virtual environment using computer graphics. Recently, using a VR headset, VR-based try-on systems have provided a completely immersive shopping experience [28], in which users can feel present in a simulated fitting room. The virtual shopping experiences provided by Alibaba's Buy+™ and DiorEye™ are examples [17]. At the current stage, customers can see static and dynamic 3D looks of a virtual model wearing different clothes [28] and check the fitting in different virtual scenes [29]. Other researchers have provided real-time walking animation in a virtual fitting room [30]. Through a virtual walk in VR, users can obtain a realistic view by thoroughly watching and examining the products in detail from different angles [17].
VR-based try-on can provide a more immersive and realistic experience, and we believe it has great potential to change the consumer experience in new ways. In this paper, we create a personalized VR-based try-on system in which users can watch and examine the fitting with their own models from different angles.

Personalized Virtual Avatars and Concerns with Virtual Try-On
The main problem with online shopping is the lack of a direct try-on experience, which may increase the perceived risk of purchases due to the difficulty of judging product fit [31]. Some systems in the literature provide virtual fitting experiences with a default virtual avatar rather than one generated from the user's own body [32,33]. The absence of "how they fit" may influence customers' purchase intention when shopping online and decrease their enjoyment of the shopping process.

Personalized Virtual Avatar
Compared to standardized avatars, customizing a virtual avatar with users' own features may provide a sense of virtual self. The creation of realistic self-avatars is important for VR- or AR-based VTO applications that aim to improve the acceptance of personalized avatars, thus providing users with a more realistic and accurate try-on experience. Recent studies have investigated the personalized VTO experience with a customized virtual avatar created using the user's own face and body figure. Personalized VTO provides a more realistic user experience when users try clothes on their virtual self [10]. Yuan et al. customized a partially visible avatar based on the user's body size and skin color and used it for proper clothes fitting; they found that a personalized avatar can increase customers' purchase choice confidence [34]. Magnenat-Thalmann et al. proposed a VTO system that allows users to virtually fit physically simulated garments on a generic body model [35]. Yang and Xiong found that a VTO experience with a personalized avatar significantly increases customer satisfaction and decreases the rate of product returns [36]. Moreover, with females as the main target customers of online shopping, body esteem toward the virtual body is particularly relevant. Thaler et al. investigated gender differences in the use of visual cues (shape, texture) of a self-avatar for estimating body weight and evaluating avatar appearance, finding that, in terms of ideal body weight, females but not males desired a thinner body [37].

Virtual Try-On Levels of Personalization
Depending on the avatar's level of personalization, the avatar representing the user may or may not provide a real sense of self [38]. According to the avatar's similarity to the user, virtual try-on systems can be divided into four levels, as proposed by Merle et al. [10]: (1) Mix-and-match: the same as traditional online shopping, where users select products using only online images. (2) Non-personalized VTO: virtual try-on experiences based on a default virtual avatar rather than one generated from the user's own body [29,39,40]; the lack of precision in describing users and products reduces the quality of the try-on experience. (3) Personalized VTO: virtual avatar models customized with personal features (skin color, height, weight, bust size, and body shape). (4) Highly personalized VTO: virtual avatar models customized with personal features, including the face model.
The highly personalized VTO requires more personalized information, which leads to better information recall for users. Users can gain a better understanding of wearing clothes on their own body, increasing their purchase choice confidence [41][42][43][44].
In this paper, we propose a highly personalized motion VTO system. We customized the virtual avatar based on the user's own face and body figures. We personalized the user's actual posture or movement as virtual avatar animation.
Compared with existing systems, our proposed AR- and VR-based try-on system makes improvements in three areas: (1) Personalization: existing VTO systems personalize only the user's physical appearance (body shape and face image); our system raises the level of avatar personalization and provides a new way for users to view virtual garments through augmented personalized motion. (2) Interactivity: most existing VTO systems only overlay 2D images of garments onto the user's body, without using 3D information or allowing users to check the garment from different viewpoints; our system lets users view a life-size personalized avatar wearing garment models, posing or walking, augmented in the real world, so the virtual garment can be viewed interactively and immersively in 360 degrees. (3) Realism of the garment model: existing VTO systems usually use pre-defined garment models, whereas our system generates 3D garment models based on information from existing shopping websites.

System Design
To provide a better virtual try-on experience, we explored the relationship between physical try-on and 3D virtual try-on (Figure 1). When customers shop in a physical store, their experience can be divided into several categories: human activity, human body, garment, and environment. We describe how our system contributes within each of these categories.

• Human Activity: Daily life activities and motions.
When trying on clothes physically, consumers often like to move their body in ways that mimic their daily poses or actions in order to observe the dynamic details of the clothes. We prepared several natural daily life motions, such as walking, sitting, and waving, to simulate consumers' activities in the real world.
• Human Body: Basic human body properties such as sex, face, height, and body shape.
During physical fitting, consumers can actually try the garments on and choose clothes that suit their own body. With 3D virtual try-on, we created a personalized virtual avatar based on the user's body figure, face photos, and personalized actual posture or movement.
• Garment: Clothing style, clothing fit, and garment type. For 3D virtual try-on, we generated a 3D garment model library for users based on garment image information from existing online shopping websites.
• Environment: Environmental conditions of the try-on experience.
During physical fitting, consumers can try on clothes in the real world or wear them under different conditions, such as walking in the street or working in the office. Using VR technology, the 3D virtual try-on incorporates several virtual scenarios simulating different physical scenes, which can help users make decisions according to different wearing conditions.
In these ways, we propose a 3D virtual try-on system using personalized avatars to simulate the physical try-on experience as accurately as possible, based on these categories. With our system, users can view their own life-size personalized virtual body wearing garment models, posing or walking, augmented in the real world. Our system consists of three main stages: avatar personalization, garment model generation, and 3D virtual try-on.

Personalized Avatar
The presence of a human model during product presentation activates the effect of self-referencing [45]. For instance, female consumers tend to compare themselves with human models and evaluate their perceived similarities based on ethnicity and body size [46]. Therefore, the similarity of the virtual human avatar to the consumers themselves can directly affect their experience in assessing the details of the garments, such as color, fabric, style, and the fit on their virtual models [38,47,48].
The virtual human body should have an appropriate 3D representation corresponding to the real user's body, facial features, and body movement. Daily life motions, such as walking and sitting, may provide users with a lifelike shopping experience. Since each user has their own style of movement and posture, the similarity of the virtual avatar's motion to the user's real motion may influence purchasing confidence. Therefore, we propose that virtual body models should include the user's own personalized motions, captured individually for each user. In this section, we discuss in detail the process of human model generation and the personalized motion of the avatar.

Human Model Generation
To allow users to assess how well the displayed products match their actual body, we personalized each user's virtual avatar to correspond to their real body shape and facial features. Figure 2 provides an overview of human model generation, which consists of two main stages: face model generation and body model generation. We generated the user's 3D body shape model using 3DLOOK software (3DLOOK, San Mateo, CA, USA) [49] and the 3D face model using the 3D Avatars SDK (AVATAR SDK, Santa Clara, CA, USA) [50].

• Face model generation: The 3D Avatars SDK [50] combines complex computer vision, deep learning, and computer graphics techniques to turn 2D photos into a realistic virtual avatar. Using the 3D Avatars SDK, we created a 3D face model from a single frontal facial image, as well as a fixed-topology head that includes the user's hair and neck.
• Body model generation: 3DLOOK [49] is 3D body model generation software. To obtain a 3D body model, we collected the user's basic body information (height and weight). As shown in Figure 2, two full-body photos (front and side) were used as inputs to obtain the 3D virtual avatar.

Figure 2. Our system uses three elements as inputs: a single full-front face image and two full-body photos (front and side), from which we generate a personalized avatar for each user.

Personalized Motion of the Avatar
To allow users to gain a sense of the real fit of clothing on themselves, we personalized their virtual avatar. When users try on clothes in an offline shop, they may also perform certain activities (e.g., sitting, walking, or posing) to check whether the clothes are suitable. Previous research has provided virtual try-on with motion: Gültepe et al. provided a realistic fitting experience with customized static poses using a depth sensor [51], and Adikari et al. introduced a virtual dressing room for real-time simulation of 3D clothes with users performing various poses [52]. These methods did not allow users to view clothes matched to their body from arbitrary angles, and so far there is a lack of research exploring the dynamic VTO experience with personalized motions.
We personalized the animation of the virtual avatar with the user's own movements, which allows users to gain a sense of wearing clothes on their own body with their own poses. The workflow of personalizing the avatar motion is shown in Figure 3 and consists of three parts: motion capture, personalized movement, and an animation library. To gather users' individual movements, we used a Kinect V2 depth sensor (Microsoft, Redmond, WA, USA) [53] to record postures and movements and create each user's own animation library for our system. The recorded animations are then smoothed in Maya (Autodesk, Mill Valley, CA, USA) [54] and attached to the user's virtual avatar.

Garment Model Generation
Compared with the traditional method of displaying garments on shopping websites, 3D garment models can enhance product presentation and help consumers better visualize garments. 3D models of fashion items have many applications in VR, AR, and computer-aided design (CAD) for the apparel industry [38].
Our approach uses garment image information from existing online shopping websites to create a virtual garment library. Textures were extracted from the garment images and mapped onto the 3D garment models in 3ds Max (Autodesk, Mill Valley, CA, USA) [55].

3D Garment Model Templates
Marvelous Designer (CLO Virtual Fashion, New York, NY, USA) [56] is a popular 3D software package for garment design based on 2D sewing patterns. It includes a garment template library that serves as the basis for creating various garment models. We customized several 3D garment model templates for the personalized human model using Marvelous Designer [56], providing 3D clothing templates of t-shirts, skirts, shirts, dresses, pants, etc. (Figure 4), based on users' preferences.

Texture Mapping
We collected garment images from existing shopping websites, such as H&M [57] and ZARA [58], and mapped these clothing images onto the generated 3D garment model templates (Figure 4).

3D Virtual Try-On
Our system was developed using Unity 3D on Windows 10 and deployed on Android smartphones. To compare the influence of AR- and VR-based try-ons on the virtual try-on experience, we designed two systems that differ in the fitting environment (virtual vs. real), as shown in Figure 5. The VR-based try-on displays 3D garment models that allow users to view their personalized human model wearing garments in a virtual environment; the AR-based try-on allows users to place their personalized model wearing garments into the real environment.

VR-Based Try-On
Environmental conditions are among the factors that influence the shopping experience, since consumers make decisions according to the try-on experience. We constructed several virtual scenes that simulate wearing the selected clothes in a variety of settings, such as on the street, in the office, and at the supermarket. Users can view the virtual garment in the different virtual scenes, giving them a more realistic image of what they would look like on various occasions or for various purposes (Figure 6).

AR-Based Try-On
With AR-based try-on, consumers can see themselves wearing garments in their real-life setting. The system detects the ground plane and places the user's virtual body into a real-life scene at life size. Thus, users can view a life-size personalized virtual body wearing garment models, with augmented posing or walking, in the real world.
To understand the role personalized motions play in the AR-based try-on experience, we designed three scenarios for validation: (1) No-motion VTO: users choose various styles of clothing on the left, and the 3D avatar on the right statically displays the effect of the clothes. (2) Predefined-motion VTO: users match clothes on their personalized virtual avatar with pre-defined animation and dynamically view the effects. (3) Personalized-motion VTO: users match clothes on their virtual avatar with personalized motion and dynamically view the effects.

Hardware Overview
We used a Google Pixel 3 (Google, Mountain View, CA, USA) smartphone and a Microsoft Kinect V2 sensor (Microsoft, Redmond, WA, USA) [53] as our primary hardware components. The Kinect V2 served as a depth sensor to capture motion and skeleton information and track each user's body movement. The depth sensor was connected to a Windows 10 computer via a USB 3.0 controller and a Kinect adapter for Windows.

Software Overview
• Development Tools: We developed our software using the Unity game engine (Unity Technologies, San Francisco, CA, USA) [59], version 2019.1.14f1, with the Vuforia SDK and Cinema Mocap for Unity. The Vuforia SDK was used to recognize the ground plane and create the AR experience; Cinema Mocap [60] was used to convert body tracking data to animation clips.

Personalized Motion
To convert the captured mo-cap data into animation, we used Cinema Mocap, a marker-less motion capture solution for Unity, to create customized animations for users. As the movement captured by the Kinect V2 depth sensor is quite jittery, we edited and smoothed the animation frame by frame in Maya [54], a 3D computer animation, modeling, simulation, and rendering package. The animations were then imported into Unity and attached to the virtual avatars.
Our framework for motion capture using Kinect is shown in Figure 7. The movements of users in the real world are converted into avatar animations in the virtual world in three steps: (1) Record the users' movements: We recorded users' movements using the Kinect V2 depth sensor and Cinema Mocap in Unity. The Kinect V2 sensor was used as a skeleton camera to track users' body motion, and Cinema Mocap was used to convert the motion capture data to avatar animation. (2) Smooth the animation: The animation captured by the Kinect V2 sensor contains trembling movements, which we smoothed out in Maya. (3) Attach the animation to the virtual avatar: Before attaching the animation, we rigged the skeleton and skin to the virtual avatar using Mixamo [61]. We then used an animation controller in Unity to drive the virtual avatar with humanoid, realistic animation.
Following these three steps, users can view the virtual avatar performing their personalized motion inside the VR or AR environment.
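Step (2) above, removing the jitter from the Kinect joint curves, was done by hand in Maya. As a rough illustration of the underlying idea, the sketch below applies a centered moving average to per-joint rotation curves in Python; the array layout, window size, and function name are our own assumptions for this sketch, not part of the actual pipeline:

```python
import numpy as np

def smooth_motion(angles, window=5):
    """Smooth per-joint rotation curves with a centered moving average.

    angles: (frames x joints) array of joint angles in degrees.
    window: odd window length; larger windows remove more jitter
            but also blur fast, intentional movements.
    """
    assert window % 2 == 1, "use an odd window so the average stays centered"
    pad = window // 2
    # Replicate the first/last frame so the output keeps the same length.
    padded = np.pad(angles, ((pad, pad), (0, 0)), mode="edge")
    kernel = np.ones(window) / window
    smoothed = np.empty_like(angles, dtype=float)
    for j in range(angles.shape[1]):
        smoothed[:, j] = np.convolve(padded[:, j], kernel, mode="valid")
    return smoothed
```

In the actual system this role is played by manual keyframe cleanup in Maya, which can also preserve contact constraints (e.g., feet staying on the ground) that a plain moving average would ignore.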

Garment Model Generation
Previous research on garment modeling started from 2D design patterns or 2D sketches, while other methods explored garment resizing and transfer from 3D template garments [27]. Compared to 2D design patterns, 3D garment models simulate the garment more precisely. Our method therefore extends these approaches by mapping 2D images onto 3D garment models; the texture-mapping method increases the realism of the garment.
We generated 3D garment models from images on online shopping websites, such as H&M and ZARA, and used Marvelous Designer [56] to create the 3D garment models. The texture mapping method is shown in Figure 8. The 3D mesh of a generated garment template is flattened into a 2D UV texture map in 3ds Max. The 2D UV map contains several parts; a T-shirt, for example, can be segmented into three parts: sleeves, front, and side. To map the web garment image to a 3D virtual garment template, we mapped each segmented part of the garment image to its corresponding part on the garment template, generating the 3D textured garment model.

Figure 8. Mapping a web garment image to generated 3D garment templates.
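The part-wise mapping can be sketched as copying segmented image regions into their slots in the template's UV atlas. The bounding boxes, part names, and atlas size below are purely illustrative (the actual mapping was done interactively in 3ds Max), and this sketch assumes source and target boxes already match in size; a real pipeline would warp and resample each patch:

```python
import numpy as np

# Illustrative T-shirt layout: each part maps a box in the source photo
# to a box in the UV atlas. Boxes are (row0, row1, col0, col1).
PART_LAYOUT = {
    "front":   {"src": (10, 110, 20, 100),  "uv": (0, 100, 0, 80)},
    "side":    {"src": (10, 110, 120, 200), "uv": (0, 100, 80, 160)},
    "sleeves": {"src": (120, 170, 20, 80),  "uv": (100, 150, 0, 60)},
}

def build_uv_atlas(photo, layout, atlas_shape=(256, 256, 3)):
    """Copy each segmented garment part from the photo into the UV atlas."""
    atlas = np.zeros(atlas_shape, dtype=photo.dtype)
    for part, boxes in layout.items():
        r0, r1, c0, c1 = boxes["src"]
        u0, u1, v0, v1 = boxes["uv"]
        patch = photo[r0:r1, c0:c1]
        # Sanity check: this sketch does not resample, so sizes must match.
        assert patch.shape[:2] == (u1 - u0, v1 - v0), part
        atlas[u0:u1, v0:v1] = patch
    return atlas
```

The resulting atlas can then be applied as the diffuse texture of the flattened garment mesh, since each mesh face's UV coordinates index into the corresponding part's slot.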

Goal
Both AR and VR can provide users with an enriched and immersive virtual try-on experience. Specifically, compared with traditional online shopping websites, VR and AR are perceived as more useful because they provide a fuller visualization of the look and feel of the product. However, it is difficult to ascertain which technology is better for virtual try-on. Virtual avatars with personalized human motion increase avatar identification and produce high avatar similarity, which may offer users a better understanding of themselves and improve their overall body satisfaction. High virtual avatar similarity enhances user attitudes toward body-involving products and the perceived usefulness of the avatar [36,62].
Therefore, the goals of our research are (1) to validate how an AR/VR-based try-on system may contribute to online shopping and (2) to understand how personalized motion relates to consumer attitudes toward the shopping experience.
Based on the goal of our system, we formulated the following hypotheses:

Hypothesis 1 (H1). AR- and VR-based try-on technologies are better than traditional online shopping and provide consumers with positive experiences.

Hypothesis 2 (H2). Compared with VR-based try-on, AR-based try-on technology provides a better representation of the product and shows how it fits the environment.

Hypothesis 3 (H3). AR-based try-on with personalized human motion can provide a more engaging shopping experience and thus increase consumers' purchase intention.
Two experiments (user study 1 and user study 2) were designed to test our hypotheses. User study 1 was conducted to examine the differences between the roles of VR-based try-on and AR-based try-on (H1, H2). User study 2 was designed to investigate the impact of personalized movement on the virtual try-on experience (H3).

Participants
A total of 12 college students participated in both user studies 1 and 2 to test our proposed hypotheses. Participants were offered extra credit as an incentive to complete the study. According to Merle et al. [10], there is no significant gender difference in the overall VTO adoption process; however, women represent the largest online apparel segment and are more sensitive to online shopping [63,64]. Adults aged 18-30 years are the usual target users for AR or VR applications, as they are more likely to try new technologies and are proactive in online shopping for fashion products. Hence, we invited 12 women (mean age = 22.9 years, SD = 1.62) to participate in the experiment.
Before the experiment, we administered a pre-questionnaire to each participant to assess their online shopping experience and their experience with AR or VR technology. We gathered pre-questionnaire responses from all 12 female participants. All participants shopped online every month, and most had more than four years of experience buying clothes online. Eleven had encountered fit problems when buying fashion products online. Most participants had some knowledge of the VTO experience and had previously experienced VR/AR. We also investigated the participants' clothing preferences and found that the most popular clothing type was the T-shirt, followed by dresses, short skirts, and coats; we therefore prepared clothing templates according to these preferences.

Experimental Design
In order to verify hypotheses H1 and H2 and to understand consumers' attitudes toward VR-based try-on, AR-based try-on, and the traditional image-only e-commerce interface, we conducted a within-subject study under these three conditions. All 12 participants completed all three conditions, and the order of presentation was randomized. After each shopping experience, participants completed a questionnaire. We also included several open-ended questions to capture participants' reflections on AR- and VR-based try-on technology.

Measures
The respondents completed the questionnaire using a 7-point Likert scale (ranging from 1, "strongly disagree", to 7, "strongly agree") adopted from existing research. The items on perceived enjoyment were adapted from Childers et al. [65], those on perceived purchase intention from Chandran and Moreitz [66], perceived attitude toward the shopping technology from Chen et al. [67], and perceived usefulness of the shopping technology from Davis [68]. We measured the enjoyment, convenience in examining the product, worry about the fit problem, usefulness, and purchase intention items in all three conditions. In addition, we measured the garment visualization and attitude toward the shopping technology items in both the VR- and AR-based try-on conditions. The questionnaire and measurement items are shown in Table 1.

Table 1. Questionnaire and measurement items.

Enjoyment: Shopping with this system was enjoyable for me.
Convenience in examining the product: I gain a sense of how the outfit might look for various occasions.
Garment visualization: Having a model in a virtual/real environment helps me understand more about the appearance of the garments.
Worry about fit problem: I feel worried that the clothes I choose may be unsuitable for me.
Usefulness: This shopping system would enhance the effectiveness of the shopping experience.
Purchase intention: It is very likely that I would purchase this product.
Attitude toward the shopping technology: I want to use this system when I buy clothes online in the future.

Results
We separated the results into two sections: (1) analysis of the ratings from the questionnaires and (2) thematic analysis of the participants' comments. Overall, the analysis showed that the AR-based try-on performed best and was the preferred choice of the participants, while the traditional online shopping interface was perceived worst.

Differences in Ratings
For statistical analysis of the differences, we used a repeated-measures one-way ANOVA. Our study meets the assumptions required for this test; regarding normality, repeated-measures ANOVA is robust to mild departures, and the data were approximately normally distributed. The ANOVA was performed using SPSS [69] to assess whether there were any statistically significant differences among the means of the three conditions. To establish the between-group differences, post hoc tests were run using the Bonferroni method. The mean and standard deviation of the measured variables for every experimental condition are presented in Table 2. Figure 9 shows the differences among the three conditions for the various items.
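The analysis pipeline (a repeated-measures one-way ANOVA followed by Bonferroni-corrected post hoc comparisons) can also be reproduced outside SPSS. The following Python sketch, using small illustrative ratings rather than the study's data, computes the repeated-measures F statistic from first principles and applies the Bonferroni adjustment to raw post hoc p-values:

```python
# Repeated-measures one-way ANOVA, computed from first principles.
# Rows are subjects, columns are the within-subject conditions
# (e.g., image-only, VR-based try-on, AR-based try-on).
# The ratings below are illustrative toy data, not the study's data.

def rm_anova_f(data):
    """Return the F statistic of a one-way repeated-measures ANOVA."""
    n = len(data)      # number of subjects
    k = len(data[0])   # number of conditions
    grand = sum(sum(row) for row in data) / (n * k)
    cond_means = [sum(row[j] for row in data) / n for j in range(k)]
    subj_means = [sum(row) / k for row in data]

    ss_total = sum((x - grand) ** 2 for row in data for x in row)
    ss_cond = n * sum((m - grand) ** 2 for m in cond_means)
    ss_subj = k * sum((m - grand) ** 2 for m in subj_means)
    ss_error = ss_total - ss_cond - ss_subj  # residual within-subject variance

    df_cond, df_error = k - 1, (k - 1) * (n - 1)
    return (ss_cond / df_cond) / (ss_error / df_error)

def bonferroni(p_values):
    """Bonferroni adjustment: multiply each p-value by the number of comparisons."""
    m = len(p_values)
    return [min(1.0, p * m) for p in p_values]

ratings = [
    [3, 5, 6],
    [2, 4, 6],
    [4, 5, 7],
    [3, 6, 5],
]
print(rm_anova_f(ratings))             # F for this toy data (approximately 16.8)
print(bonferroni([0.01, 0.04, 0.20]))  # three pairwise comparisons corrected
```

Note that the p-value lookup (against an F distribution with the given degrees of freedom) and the paired t-tests themselves are left to a statistics package in practice; this sketch only illustrates the sums-of-squares decomposition and the multiple-comparison correction reported above.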

Qualitative Differences
We also asked about participants' preferences for each condition at the end of the study. The questionnaire administered at the end of the experiment contained a set of open-ended questions. We found that all participants preferred the AR-based condition.
We conducted a thematic analysis of the participants' responses for the AR- and VR-based try-ons.
For the AR-based try-on, we collected the participants' comments and summarized the keywords of concern. Table 3 shows the recurring themes and the number of participants mentioning each one. Participants liked the AR-based try-on because it provided a real interactive environment and enabled placing the virtual avatar in a real-life scene. Some participants found the AR view helpful for observing the garment model, and manipulating the garment model was easier. AR-based try-on allows users to manipulate their virtual avatars in the real world, which improves the interactivity between the user and the virtual avatar, thus enhancing user enjoyment during shopping.
For the VR-based try-on, we also collected comments from the participants and summarized the related keywords (Table 4). Participants disliked the VR-based try-on for several reasons.
The participants complained about the difficulty of observing the garment model within the VR view. This is a limitation of the fixed camera perspective, which can provide neither a good overview nor a detailed view. The participants also complained that the virtual avatar did not look like themselves in the VR view due to the virtual environment and lighting problems, which affected their fitting experience.

Figure 9. Perceived levels of enjoyment, usefulness, worry about fit problem, convenience, purchase intention, garment visualization, and attitude to shopping technology. Note: * p < 0.05, ** p < 0.01, *** p < 0.001.

Table 2. Post hoc between-group results.

Dependent Variable Condition (A) Mean (A) Condition (B) Mean (B) Significance of the Mean Difference (A vs. B)

Table 3. Illustrative excerpts and main conclusions of free comments mentioning AR-based try-on.

Conclusions Free Comments
AR-based try-on
Highly realistic: "Light and environment in AR-based try-on are more realistic than VR-based try-on." "AR-based real environment is closer to a real-life scene, making the fitting effect more real."
High accuracy of human avatar: "The virtual human model in VR-based try-on does not look like me." "I can pay more attention to the sense of the whole body in AR-based try-on."
Better 3D garment visualization: "I can see the effect close up or from a distance in AR-based try-on. It is very convenient for checking the details of the garments." "I can obtain a view by looking thoroughly and examining the garments in detail from different angles."
High interactivity: "AR-based try-on is superimposed on the real world. I can interact with my model in the real-world background, right next to me." "It's more interesting to interact with my own virtual avatar in a real-life scene." "It is a game-like experience; I can put my virtual body into real life and interact with it. It is very interesting for me."

Table 4. Illustrative excerpts and main conclusions of free comments mentioning the VR-based try-on.

Conclusions Free Comments
VR-based try-on
Higher attractiveness: "The different wearing conditions give me some inspiration for how to design an outfit. It is very interesting." "AR-based try-on looks better, but I also like the changing-environment function in VR-based try-on. So, if they can be combined, it will be better."
Low accuracy of human avatar: "I think the virtual human model in VR-based try-on does not look like me, so when I am fitting clothes in VR, it is not realistic."
Single perspective and lower interactivity: "I can view virtual human models and clothes models in 360 degrees in AR-based try-on. The perspective of VR-based try-on is too simple. I cannot view the models from the viewpoints that I wanted."

Testing of H1 and H2
As shown in Figure 9, we found significant differences between the image-only interface and both the AR- and VR-based try-ons in perceived enjoyment, usefulness, worry about the fit problem, and convenience. As hypothesized, compared with the traditional e-commerce interface, AR- and VR-based try-ons create a more positive experience for the user (H1). In addition, AR-based try-on performs better than VR-based VTO, showing positive effects on enjoyment and purchase intention, and all the participants preferred it. According to the participants' comments, most thought that AR-based try-on provided better 3D visualization and more realistic virtual avatars, making the overall fitting experience more realistic than VR-based try-on (H2). We also found that users focused more on the accuracy of the model in the VR environment, whereas in AR they focused less on the accuracy of the model and more on the overall feel of the model within the real environment. This allows users to better evaluate how the clothes look on their virtual body.

Experimental Design
To investigate whether personalized-motion VTO can provide a more engaging shopping experience for users and thus increase their purchase intention, we conducted a within-subject study under three conditions: (1) AR-based try-on with no motion (NM), in which participants try on virtual clothes using a static virtual avatar; (2) AR-based try-on with predefined motion (PM), in which the avatar plays generic predefined animations; and (3) AR-based try-on with personalized motion, in which the avatar plays the motions recorded from the participant. We invited 12 participants to participate. Each participant was asked to shop online using the system under these three conditions, and the order of presentation was randomized. After each shopping experience, participants were asked to complete a questionnaire. Finally, each participant selected their favorite condition and explained the reason for their selection.

Measures
The respondents completed the questionnaire using a 7-point Likert scale (ranging from 1, "strongly disagree", to 7, "strongly agree") adopted from existing research. The items on body similarity were adopted from Banakou [70], and those on purchase intention from Chandran and Moreitz [66]. We measured the body similarity, usefulness, garment visualization, enjoyment, and purchase intention items under all three conditions, while movement similarity was measured only in the predefined motion and personalized motion conditions. The questionnaire and measurement items are shown in Table 5.

Table 5. Questionnaire and measurement items.

Measurement Items of User Study 2
Body similarity: I feel that the virtual body I saw was my own body.
Movement similarity: I feel that the movement of the virtual avatar was similar to my own movements.
Usefulness: I can imagine what it looks like when I am wearing clothes by performing some activities in real life.
Garment visualization: A model walking in a real environment helps me know more about the appearance of the clothes.
Enjoyment: Seeing my own model in the real world makes me feel interested.
Purchase intention: The probability that I would buy the product is very high.

Results
We separated the results into two sections: (1) analysis of the ratings from the questionnaires and (2) thematic analysis of the participants' comments.

Differences in ratings
A one-way ANOVA was performed using SPSS (IBM, Armonk, NY, USA) [69] to assess whether there were statistically significant differences among the means of the three conditions. To establish the between-group differences, post hoc tests were run using the Bonferroni method. The mean and standard deviation of the measured variables for each experimental condition are presented in Table 6. Figure 10 shows the differences among the three conditions for the various items.
The ANOVA results show significantly higher ratings for "Usefulness" (p < 0.05) and "Movement similarity" (p < 0.001), while no significant differences were found among the three conditions in garment visualization, enjoyment, body similarity, or purchase intention.

Qualitative differences
The qualitative comments from the participants are listed in Table 7. The participants evaluated the three systems from four aspects: garment quality, virtual body similarity, enjoyment, and personalized motion. They thought that the interactivity with the garment should be improved and that the material of the garments was not realistic. The personalized virtual avatar provided by our system was similar to the users; in particular, the personalized motion offered users a better sense of fitting on the "real me", which made the virtual avatar more realistic. As for enjoyment, some participants mentioned that the personalized motion made them more interested in changing motions while fitting.
We also asked for participants' preferences for each condition at the end of the study. Among the 12 female participants, 10 preferred the personalized motion condition, since it offered a better sense of the "real me" and made the virtual avatar's movement more realistic. In addition, personalized motion can help users gain a sense of wearing the clothes on their own body with different motions in their daily life. Two participants preferred the predefined motion condition; they thought that the personalized motion was not as smooth as the predefined motion, especially for the walking animation. If the personalized motion were smoother and more natural, they would choose the personalized-motion VTO as their preferred condition.

Table 7. Illustrative excerpts and main conclusions in free comments.

Conclusions Free Comments
Garment quality: The material of the clothes is not realistic.
"The garment looks unrealistic." "The material of the clothes does not look real."
Body similarity: The virtual avatar with personalized motion is more similar to the user.
"The virtual avatar looks like me." "The face of the virtual avatar is very similar to me." "The virtual avatar is just another me."
Enjoyment: AR-based VTO with personalized motion is enjoyable for users.
"The personalized motion condition makes me more interested in changing my avatar's motion." "Personalized motion is realistic, making me feel engaged."
Personalized motion: Although the personalized motion is not as smooth as the predefined motion, it is more similar to users' own movements.
"The personalized motion looks more natural and realistic." "Compared with predefined motion, personalized motion offers a sense of the real me, which makes the virtual avatar's movement more realistic." "The predefined motion is standardized, which does not look similar to me." "The personalized motion is not smooth; I could gain a better understanding with a smoother motion." "With the personalized motion, I can also check the shape of the clothes when the model is moving." "The AR-based VTO with personalized motion is closest to me. I feel like I am looking into a mirror." "It would be nice if the virtual avatar could have the user's facial expression as well."

Figure 10. Perceived levels of body similarity, usefulness, garment visualization, purchase intention, and movement similarity. Note: * p < 0.05, ** p < 0.01.

Testing of H3
We hypothesized that AR-based try-on with personalized avatar movement would positively affect users' attitudes toward the apparel product and their shopping intentions. To validate H3, we designed user study 2 to explore personalized motion and concerns about the VTO experience. In this experiment, we measured six items to assess the user experience with VTO. We found significant differences (p < 0.01) in movement similarity between the predefined and personalized motion conditions. Using motion capture technology to capture user movement offered a better sense of the "real me" and made the virtual avatar's movement more realistic. However, we found no direct effect of personalized motion on attitude toward the clothes or purchase intention, so H3 was not supported. According to the participants' feedback, one of the main factors influencing user attitude toward the clothes was the quality of the virtual garment, as several participants responded that the material of the clothes was not realistic enough.

Discussion
In this study, we compared the effects of AR- and VR-based try-on (user study 1) and explored the effects of having personalized movement during the VTO experience (user study 2). The results from user study 1 revealed that AR-based try-on performed better than VR-based try-on, especially in the 3D representation of clothing and the fidelity of the personalized human avatars. With AR technology, the real interactive environment makes the fitting effect more realistic. In contrast, VR-based try-on provides several virtual environments, which can give users a sense of wearing clothes in different conditions; however, users' attention could be drawn to inaccuracies in the avatar. These findings indicate that AR-based try-on shows more potential for the future of online shopping. The results from user study 2 indicated that users show a strong interest in the application of personalized motions in VTO technology; however, a high degree of personalized-motion accuracy is required.

Theoretical Implications
This research provides three important contributions. First, the findings contribute to the use of personalized avatars for garment visualization in e-commerce. Currently, users often use a personal computer or mobile device to shop online. On these platforms, the shopping experience is limited to 2D displays of products. These constraints make online shopping less convenient, especially when the resulting look and fit are important factors for these types of products. The main finding of this study is that a highly personalized human avatar is an important factor for realistically presenting online products and for simulating a better fit effect in the VTO experience. We found that AR-based try-on using personalized avatars can provide better 3D visualization of a product because AR can overlay virtual objects onto one's view of the real environment. Having the 3D virtual product fit the personalized human avatar in the user's real-life scene improves the realism of garment visualization.
Second, our research enriches the literature comparing the different roles of AR- and VR-based try-on. Previous research [71,72] investigated the impacts of AR/VR technology on the online shopping experience and reported mixed results, but overlooked the differences between AR- and VR-based try-on in the VTO experience. AR-based try-on has been reported to provide better and more realistic garment visualization than VR-based try-on, since AR environments include the real-life scene, making the fitting effect more realistic. In addition, since AR-based try-on can be achieved on a smartphone, users can thoroughly view the 3D garment from different angles, providing a brand-new product presentation experience. In this research, we found that the benefits of AR-based try-on are not always obvious: VR-based try-on offers multiple virtual scenes, allowing users to try on clothes in different settings, which may inspire some users when designing an outfit. Overall, however, AR-based try-on performed better.
Third, previous research showed that the virtual try-on experience with a personalized virtual avatar can create a positive attitude toward shopping technology and increase purchase intention [73][74][75]. Our research enriches the literature on the impact of personalized motion during the virtual try-on experience. Although we found that virtual try-on with personalized motion did not create a better attitude toward the garment product or greater purchase intention, we identified two potential influences of personalized motion on the virtual try-on experience. First, personalized motion provides a better sense of "true fit" due to the high similarity between the user's and the avatar's movements. As discussed in previous research, the success of traditional product presentation with a virtual avatar depends on the similarity of the avatar's appearance to the real user [10,75,76]. We propose another aspect of avatar similarity: in addition to the avatar's appearance and body shape, we also include the user's movement. Second, virtual try-on with personalized motion may offer consumers a helpful and realistic experience. Personalized motion provides users with a more engaging interaction with the virtual product and avatar, and offers a more enjoyable experience to communication-sensitive users when shopping online [77]. Overall, this research offers a better understanding of the impact of personalized motion during the virtual try-on experience. Compared with previous research, we proposed a highly personalized virtual try-on system that differs from the four virtual try-on systems mentioned by Merle et al. [10]. We have opened up a new direction for personalized virtual try-on systems, increasing the possibilities for enhancing online shopping enjoyment and the virtual fitting experience for users.

Practical Implications
Our findings offer new insights for e-commerce, especially for fashion items. For companies, our results suggest that AR-based try-on systems on mobile phones can be easily used by customers, which can significantly increase customer shopping enjoyment and reduce product returns. Using AR or VR technology, companies can display products in more detail using 3D visualization, which allows users to view fashion products from different directions and angles. Retailers may focus on the presentation of 3D garment models on virtual human models in real-life scenes using AR technology.
For customers, our systems can provide a better understanding of how a product will look on them. Using 3D modeling technology, users can easily create a virtual avatar from smartphone camera images (such as with 3DLOOK). Such 3D avatar generation techniques increase consumer convenience when examining products before purchasing. The try-on experience with a personalized virtual avatar gives customers a sense of "how it would look" so that they can better judge whether clothes will suit them, thereby increasing consumer satisfaction with online shopping.

Limitations and Future Research
This research has limitations that could be addressed in future studies. Some limitations are related to motion capture. At the current stage of research, we used Kinect to record users' personalized motions. Since the Kinect sensor's skeletal detection is limited and its body-joint tracking is not highly accurate, the resulting personalized human motion is not as smooth as the predefined motions. Therefore, future research may use more advanced motion capture equipment to improve the accuracy of human motion. Another limitation is related to the avatar generation technique. At the current stage, we used two separate tools to generate a 3D body model and a face model from users' images and then integrated the two models, which required some manual adjustment. Therefore, future work may focus on generating a complete and precise virtual avatar directly from the user's images.
Our future research may focus on a fully personalized try-on experience. Garment fit on the personalized virtual avatar may help consumers ascertain fit, thus improving their confidence in purchase choices. We expect to use a virtual avatar that has body motions or even facial expressions in future VTO. With facial expression detection technology, a virtual avatar could display human-like expressions, improving the fidelity of the human avatar.
Finally, as VR- and AR-based try-on systems have different positive effects on the online shopping experience, further research may find new ways of combining VR and AR technology in the VTO experience, which may help build a fully immersive and interactive interface. AR/VR technology allows for the possibility of creating a fully real/virtual environment, providing users with a unique and innovative experience. The results from user study 1 showed that AR produces a better VTO effect than VR. However, the results also showed that the virtual environment in VR-based try-on gives users a more immersive shopping experience, inspiring them in terms of outfit design based on different wearing conditions. Therefore, further research may combine the advantages of AR and VR technology and discover new methods that improve different aspects of the online shopping experience and overall customer satisfaction. For instance, in AR-based try-on, virtual components could be composited into the real scene to simulate a variety of virtual environments, helping users try on clothes under different wearing conditions.

Conflicts of Interest:
The authors declare no conflict of interest.

Abbreviations
The following abbreviations are used in this paper: