
Comparing VR- and AR-Based Try-On Systems Using Personalized Avatars

Information, Production and Systems (IPS), Waseda University, Kitakyushu, Fukuoka 808-0135, Japan
Rakuten Institute of Technology, Rakuten, Inc., Tokyo 158-0094, Japan
Authors to whom correspondence should be addressed.
Electronics 2020, 9(11), 1814;
Received: 26 September 2020 / Revised: 23 October 2020 / Accepted: 27 October 2020 / Published: 2 November 2020
(This article belongs to the Special Issue Human Computer Interaction and Its Future)


Despite the convenience offered by e-commerce, online apparel shopping presents various product-related risks, as consumers can neither physically see nor try products on themselves. Augmented reality (AR) and virtual reality (VR) technologies have been used to improve the online shopping experience. Therefore, we propose an AR- and VR-based try-on system that provides users with a novel shopping experience in which they can view garments fitted onto their personalized virtual body. Recorded personalized motions allow users to dynamically interact with their dressed virtual body in AR. We conducted two user studies to compare the different roles of VR- and AR-based try-ons and to validate the impact of personalized motions on the virtual try-on experience. In the first user study, a mobile application with AR- and VR-based try-on is compared to a traditional e-commerce interface. In the second user study, personalized avatars with pre-defined motion and with personalized motion are compared to a personalized no-motion avatar in AR-based try-on. The results show that AR- and VR-based try-ons can positively influence the shopping experience compared with the traditional e-commerce interface. Overall, AR-based try-on provides a better and more realistic garment visualization than VR-based try-on. In addition, we found that personalized motions do not directly affect the user’s shopping experience.

1. Introduction

With the continuous development of internet and mobile technology, apparel e-commerce is rapidly expanding worldwide, and the number of consumers purchasing clothes online keeps increasing [1]. However, approximately $62 billion worth of returns occur annually in the fashion industry [2]. In addition, almost 56% of consumers reported apprehension about garment style and fit as major sources of concern with online shopping. The majority of online shopping websites display garments through 2D photos of garments and human models. Consumers’ purchasing decisions therefore depend mostly on 2D images of garments, without the ability to inspect a piece of clothing in detail or better understand its style. Apparel is closely tied to an individual’s physical body figure. On typical online shopping websites, which are purely digital environments, consumers are unable to physically try on clothes. Discrepancies may exist between the actual products and consumers’ perceived body sizes, making it difficult for consumers to determine fit [3], which may negatively impact their shopping experience and purchase intention [4,5].
Today, with the growth of virtual try-on (VTO) technology, consumers can browse a broad range of products and try items on in the online shopping environment. VTO technology provides a virtual try-on experience by simulating the consumer’s physical figure with virtual models based on human measurements [6]. VTO can meet the various needs of consumers and help them accurately assess fit and size in the online shopping environment [6,7].
Potential concerns with VTO simulations, such as accuracy, may give rise to consumer dissatisfaction with the fitting results; thus, VTOs have not been widely adopted to date [7,8]. Poor representation of the consumer’s actual body and the lack of face representation may lead to an unrealistic perception of VTO, strongly affecting the quality of the consumer shopping experience. Therefore, prior work has concentrated on accurately simulating the human representation. The low level of accuracy could be partly due to the lack of advanced 3D avatar technology. Some VTOs use a 3D virtual avatar created by reflecting the body measurements and facial features of users [9]. Such a 3D virtual avatar may increase the accuracy of virtual fitting and enhance the hedonic experience while shopping [10]. Furthermore, when trying on clothes in real life, consumers may pose in front of the mirror to further assess the fit and style of the garments. Perceptual features, such as personalized motion, may also influence the perceived accuracy. However, most VTOs emphasize physical body measurements to personalize a virtual self and create a fitting experience that is as accurate as possible, overlooking the impact of personalized motion on the try-on experience.
Recently, AR and VR technologies have been applied in various fields, such as travel, education, entertainment, training, and fashion [11,12,13]. AR technology refers to the overlay of virtual 3D objects onto one’s view of the physical environment [14]. AR in online shopping apps has also been gaining momentum, for example, in eyeglasses or jewelry try-on mobile apps. These apps help customers try on different eyeglasses or jewelry without physically putting them on [15]. Using AR technology, a VTO enables consumers to try a number of products augmented on a mirror image of themselves via a digital display with a camera directed at them, commonly called a “magic mirror” [16]. A number of fashion firms have employed AR fitting rooms in the form of a magic mirror, including Uniqlo and Gap. However, the magic mirror is typically deployed in physical stores. Therefore, some fashion brands, such as RIXO London and Tommy Hilfiger, allow consumers to perform virtual fitting and view a virtual catwalk in a mobile app [17] using VR technology, which creates a realistic-looking world in an entirely synthetic environment using computer graphics. With this type of VTO, consumers can feel as if they are present in the simulated fitting room [18]. VR fitting has the potential to alter the consumer experience in a new way because it can provide a more immersive fitting experience. Until now, consumer responses to VTO during shopping have shown mixed results, and the variation between AR- and VR-based try-ons has not been examined.
In this paper, we discuss e-commerce, VR/AR technologies, and how AR/VR may create value for online retailers by overcoming the limitations of traditional e-commerce. We also examine how personalized motion can contribute to the consumer shopping experience.
To validate how AR/VR may contribute to online shopping, we compare the roles of AR and VR in the VTO experience. To understand how personalized motion relates to consumer attitudes toward the shopping experience, we developed an interactive VTO system that supports personalized interactions through a smartphone. Users can view their dressed body in the real world from different viewpoints. In addition, we propose that the virtual body model include the user’s own personalized motion. This gives users a better understanding of whether a garment suits them while they move in their own unique way. The system implements interaction techniques across the AR and VR technologies and demonstrates how each can be used for online shopping tasks. To gather feedback on the AR/VR try-on interaction techniques and personalized motion, we ran two user studies that provided insights, observations, and guidance for future work.

2. Background and Related Work

2.1. AR- and VR-Based Try-On

VTO is a form of image interactivity technology (IIT) [10] that simulates the try-on component of users’ online shopping experience using AR or VR technology. Previous research on VTO can be divided into AR-based and VR-based virtual try-on.

2.1.1. AR-Based Try-On

AR is a computer-simulated interactive technology that enriches the user experience by integrating additional information into the user’s real world. AR provides users with more realistic, self-descriptive product experiences. Previous research has studied two types of AR VTO: (1) 2D overlay AR-based VTO, which superimposes images of products onto the user’s body; and (2) 3D AR-based VTO, which provides 3D visualization of products on the user’s body.
  • 2D Overlay AR-Based Try-On
    Earlier work on 2D overlay virtual try-on was mostly conducted in computer graphics [19,20,21]. A 2D overlay VTO overlays a projected 2D image of a product onto an image of the user and the surrounding real environment. Hilsmann et al. re-textured garment overlays for real-time visualization of garments in a virtual mirror environment [22]. Yamada et al. proposed a method for reshaping the garment image based on human body shapes to make fitting more realistic [23]. Zheng et al. aligned the target clothing item on a given person’s body and presented a try-on look of the person in different static poses [24]. However, 2D overlay virtual try-on does not adapt well to dynamic poses when users perform certain activities. In addition, like many other re-texturing approaches, these methods operate only in 2D without using 3D information, which prevents users from viewing their virtual self from arbitrary viewpoints.
  • 3D AR-Based Try-On
    Compared to 2D images, 3D garment models simulate garments more precisely. Previous research focused on matching the 3D virtual garment to the user’s body shape or virtual avatar. Pons-Moll et al. introduced a system using a multi-part 3D model of clothed bodies for clothing extraction and re-targeting the clothing to new body shapes [25]. Zhou et al. generated 3D models of garments from a single image [26]. Yang et al. generated garment models by fitting 3D templates to users’ bodies [27]. Duan et al. proposed a multi-part 3D garment model reconstruction method to generate virtual garments for fitting on virtual avatars [25].
In general, 3D AR-based try-on is superior to 2D overlay AR-based try-on, because matching a 3D garment model to a 3D avatar presents a more accurate representation of the garment and its fit. It also provides users with a multi-angle view of the garments. In our research, we propose a method for users to view the fitting interactively, enabling them to check the garments by augmenting the motion of a personalized avatar in the real world.

2.1.2. VR-Based Try-On

Virtual reality technology generates a fully virtual environment by utilizing computer graphics. Recently, using VR headsets, VR-based try-on systems have provided a completely immersive shopping experience for users [28]. Users can feel as if they are present in a simulated fitting room; the virtual shopping experiences provided by Alibaba’s Buy+TM and DiorEyeTM are examples [17]. At the current stage, customers can see static and dynamic 3D looks of a virtual model wearing different clothes [28] and check the fit in different virtual scenes [29]. Other researchers have tried to provide real-time walking animation in a virtual fitting room [30]. Through a virtual walk in VR, users can get a realistic view by thoroughly watching and examining the products and their details from different angles [17].
VR-based try-on technology can provide a more immersive and realistic experience, and we believe it has great potential to change the consumer experience in new ways. In this paper, we create a personalized VR-based try-on system in which users can watch and examine the fit on their own models from different angles.

2.2. Personalized Virtual Avatar on Concerns with Virtual Try-On

The main problem with online shopping is the lack of a direct try-on experience, which may increase the perceived risk of purchases due to the difficulty in judging product fit [31]. Some systems in the literature provide virtual fitting experiences on a default virtual avatar rather than one generated from the user’s own body [32,33]. The absence of “how they fit” may influence customer purchase intention when shopping online and decrease consumer enjoyment of the shopping process.
  • Personalized Virtual Avatar
    Compared to standardized avatars, customizing a virtual avatar with the user’s own features may provide a sense of virtual self. The creation of realistic self-avatars is important for VR- or AR-based VTO applications that aim to improve the acceptance of personalized avatars, thereby providing users with a more realistic and accurate try-on experience. Recent studies have investigated the personalized VTO experience with a customized virtual avatar created using the user’s own face and body figure. Personalized VTO provides a more realistic user experience when users try clothes on their virtual self [10]. Yuan et al. customized a partially visible avatar based on the user’s body size and skin color and used it for proper clothes fitting; they found that using a personalized avatar can increase customers’ purchase choice confidence [34]. Nadia Magnenat-Thalmann et al. proposed a VTO system that allows users to virtually fit physically simulated garments on a generic body model [35]. Yang and Xiong found that a VTO experience with a personalized avatar significantly increases customer satisfaction and decreases the rate of product return [36]. Moreover, as females are the main target customers of online shopping, their body esteem regarding their virtual body is particularly relevant. Anne Thaler et al. investigated gender differences in the use of visual cues (shape, texture) of a self-avatar for estimating body weight and evaluating avatar appearance; in terms of ideal body weight, females, but not males, desired a thinner body [37].
  • Virtual Try-On Levels of Personalization
    Depending on the avatar’s level of personalization, the avatar representing the user may or may not provide a real sense of self [38]. According to the avatar’s similarity to the user, virtual try-on systems can be divided into four levels, as proposed by Merle et al. [10]:
    Mix-and-match: Same as the traditional online shopping where users can select the products using only online images.
    Non-personalized VTO: Virtual try-on experiences based on a default virtual avatar rather than one generated from the user’s own body [29,39,40]. The lack of precision in representing users and products reduces the quality of the virtual try-on experience.
    Personalized VTO: Virtual avatar models are customized with personal features (face color, height, weight, bust size, and body shape).
    Highly personalized VTO: Virtual avatar models are customized with personal features, including the face model.
The highly personalized VTO requires more personalized information, which leads to better information recall for users. Users gain a better understanding of how clothes look on their own body, increasing their purchase choice confidence [41,42,43,44].
In this paper, we propose a highly personalized motion VTO system. We customized the virtual avatar based on the user’s own face and body figure, and personalized the user’s actual posture and movement as virtual avatar animation.
Compared with existing systems, our proposed AR- and VR-based try-on system improves on three aspects:
Personalization: Existing VTO systems only personalize the user’s physical presence (body shape and face image). Our proposed AR- and VR-based try-on system raises the level of avatar personalization and provides a new way for users to view the virtual garment through augmented personalized motion.
Interactivity: Most existing VTO systems only overlay 2D images of garments onto the user’s real body, without using 3D information and without allowing users to check the garment from different viewpoints. Our proposed AR- and VR-based try-on system enables users to view a life-size personalized avatar wearing garment models, posing or walking, augmented in the real world. Users can view the virtual garment interactively and immersively in 360 degrees.
Realism of the garment model: Existing VTO systems usually use pre-defined garment models. Our proposed AR- and VR-based try-on system generates 3D garment models from product information on existing shopping websites.

3. System Design

To provide a better virtual try-on experience, we explored the relationship between physical try-on and 3D virtual try-on (Figure 1). When customers shop in a physical store, their experiences can be divided into several categories: human activity, human body, garment, and environment. We describe how our system contributes within each of these categories.
  • Human Activity: Daily life activities and motions.
    When trying on clothes physically, consumers often like to move their body in such a way that mimics their daily life poses or actions in order to observe the dynamic details of the clothes. We prepared several natural daily life motions to simulate the consumers’ activities in the real world, such as walking, sitting, waving, etc.
  • Human Body: The basic human body property information such as sex, face, height, and body shape.
    During physical fitting, consumers have the opportunity to actually try the garments on and choose clothes that suit their own body. For 3D virtual try-on, we created a personalized virtual avatar based on the user’s body figure, face photos, and personalized actual posture or movement.
  • Garment: Clothing style, clothing fit, and garment type. For 3D virtual try-on, we generated a 3D garment model library for users based on the garment image information from existing online shopping websites.
  • Environment: Environmental conditions of the try-on experience.
    During physical fitting, consumers can actually try on clothes in the real world or wear the clothes under different conditions, such as walking in the street, working in the office, etc. Using VR technology, the 3D virtual try-on incorporates several virtual scenarios, simulating different physical scenes, which can help users make decisions according to different wearing conditions.
Based on these categories, we propose a 3D virtual try-on system using personalized avatars to simulate the physical try-on experience as accurately as possible. With our system, users can view their own life-size personalized virtual body wearing garment models, posing or walking, augmented in the real world. Our system consists of three main stages: avatar personalization, garment model generation, and 3D virtual try-on.

3.1. Personalized Avatar

The presence of a human model during product presentation activates the effect of self-referencing [45]. For instance, female consumers tend to compare themselves with human models and evaluate their perceived similarities based on ethnicity and body size [46]. Therefore, the similarity of the virtual human avatar to the consumers themselves can directly affect their experience in assessing the details of the garments, such as color, fabric, style, and the fit on their virtual models [38,47,48].
The virtual human body should have an appropriate 3D representation corresponding to the real user’s body, facial features, and body movement. Daily life motions, such as walking and sitting, may provide users with a lifelike shopping experience. Since each user has their own style of movement and posture, the similarity of the virtual avatar’s motion to the user’s real motion may influence purchasing confidence. Therefore, we propose that virtual body models should include the user’s own personalized motions, which are captured individually for each user. In this section, we discuss in detail the process of human model generation and the personalized motion of the avatar.

3.1.1. Human Model Generation

To allow users to assess how well the displayed products match their actual body, we personalized each user’s virtual avatar to correspond to their real body shape and facial features.
Figure 2 provides an overview of human model generation, which consists of two main stages: face model generation and body model generation. We generated the 3D body shape model of the user using 3DLook software (3DLOOK, San Mateo, CA, USA) [49] and generated the 3D face model of users using 3D Avatars SDK (AVATAR SDK, Santa Clara, CA, USA) [50].
  • Face model generation
    3D Avatars SDK [50] combines computer vision, deep learning, and computer graphics techniques to turn 2D photos into a realistic virtual avatar. Using 3D Avatars SDK, we created a 3D face model from a single frontal facial image, together with a fixed-topology head mesh that includes the user’s hair and neck.
  • Body model generation
    3DLOOK [49] is 3D body model generation software. To obtain a 3D body model, we collected each user’s basic body information (height and weight). As shown in Figure 2, two full-body photos (front and side) were used as inputs to obtain the 3D virtual avatar.

3.1.2. Personalized Motion of the Avatar

To allow users to gain a sense of the real fit of the clothing on themselves, we personalized their virtual avatar. Moreover, when users try on clothes in an offline shop, they may perform certain activities (e.g., sitting, walking, posing) to check whether the clothes are suitable. Previous research has provided virtual try-on with motions. Gültepe et al. provided a realistic fitting experience with customized static poses using a depth sensor [51]. Adikari et al. introduced a virtual dressing room for real-time simulation of 3D clothes with users performing various poses [52]. These methods did not allow users to view clothes that match their body from arbitrary angles. So far, there is a lack of research exploring the dynamic VTO experience with personalized motions.
We personalized the animation of the virtual avatar with the user’s movements, which allows users to gain a sense of wearing clothes on their own body with their own poses. The workflow of personalizing the avatar motion is shown in Figure 3 and consists of three sections: motion capture, personalized movement, and the animation library. To gather users’ individual movements, we used a Kinect V2 depth sensor (Microsoft, Redmond, WA, USA) [53] to record each user’s postures and movements and create their own animation library for our system. The recorded animations are then smoothed in Maya (Autodesk, Mill Valley, CA, USA) [54] and attached to the user’s virtual avatar.
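In our pipeline, the jittery Kinect recordings are smoothed frame-by-frame in Maya. As a rough illustration of what such smoothing does, the following Python sketch (a hypothetical stand-alone helper, not part of the actual Unity/Maya pipeline) applies a centered moving average to a single joint's position track:

```python
def smooth_joint_track(frames, window=5):
    """Smooth a jittery sequence of joint positions with a centered
    moving average. `frames` is a list of (x, y, z) tuples for one
    joint; `window` is the odd number of frames averaged per sample.
    (Hypothetical helper for illustration only.)"""
    half = window // 2
    smoothed = []
    for i in range(len(frames)):
        lo = max(0, i - half)            # clamp window at track start
        hi = min(len(frames), i + half + 1)  # clamp at track end
        n = hi - lo
        sx = sum(f[0] for f in frames[lo:hi]) / n
        sy = sum(f[1] for f in frames[lo:hi]) / n
        sz = sum(f[2] for f in frames[lo:hi]) / n
        smoothed.append((sx, sy, sz))
    return smoothed

# A noisy track oscillating around x = 1.0 becomes visibly flatter:
noisy = [(1.0 + (-1) ** i * 0.1, 0.0, 0.0) for i in range(10)]
clean = smooth_joint_track(noisy, window=5)
```

In practice, artist-driven smoothing of animation curves in Maya is more selective than a uniform filter, which can blur intentional fast motions; the sketch only conveys the basic idea of averaging out frame-to-frame tremble.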

3.2. Garment Model Generation

Compared with the traditional method of displaying garments on shopping websites, 3D garment models can enhance product presentation and help consumers better visualize garments. 3D models of fashion items have many applications in VR, AR, and computer-aided design (CAD) for the apparel industry [38].
Our approach uses garment image information from existing online shopping websites to create a virtual garment library. Textures were extracted from the garment images and mapped onto the 3D garment model in 3Ds max (Autodesk, Mill Valley, CA, USA) [55].

3.2.1. 3D Garment Model Templates

Marvelous Designer (CLO Virtual Fashion, New York, NY, USA) [56] is a popular 3D software package for garment design based on 2D sewing patterns. It includes a garment template library that serves as the basis for creating various garment models. We customized several 3D garment model templates for the personalized human model using Marvelous Designer [56] and provided 3D clothing templates of T-shirts, skirts, shirts, dresses, pants, etc. (Figure 4) based on user preferences.

3.2.2. Texture Mapping

We collected garment images from existing shopping websites, such as H&M [57] and ZARA [58], and mapped these clothing images onto the 3D garment model templates (Figure 4).

3.3. 3D Virtual Try-On

Our system was developed using Unity 3D on Windows 10 and deployed on Android smartphones. To compare the influence of the virtual try-on experience between AR- and VR-based try-ons, we designed two systems that differ in the fitting environment (virtual environment vs. real environment), as shown in Figure 5. The VR-based try-on displays 3D garment models that allow users to view their personalized human model wearing garments in a virtual environment; the AR-based try-on allows users to place their personalized model wearing garments in the real environment.

3.3.1. VR-Based Try-On

Environmental conditions are among the factors that influence the shopping experience, since consumers make decisions according to the try-on experience. We constructed several virtual scenes that simulate wearing the selected clothes in a variety of settings, such as on the street, in the office, and at the supermarket. Users can view the virtual garment in the different virtual scenes, giving them a more realistic image of what they would look like on various occasions or for various purposes (Figure 6).

3.3.2. AR-Based Try-On

With AR-based try-on, consumers can see themselves wearing garments in their real-life setting. The system detects the ground plane and places the user’s virtual body into the real-life scene at life size. Thus, users can view a life-size personalized virtual body wearing garment models, with augmented posing or walking, in the real world.
To understand the role personalized motions play in the AR-based try-on experience, we designed three scenarios for validation:
1. No-motion VTO: Users can choose various styles of clothing on the left, and the 3D avatar on the right shows the effect of the clothes, displayed statically.
2. Predefined-motion VTO: Users can match clothes on their personalized virtual avatar with pre-defined animation and dynamically view the effects.
3. Personalized-motion VTO: Users can match clothes on their virtual avatar with personalized motion and dynamically view the effects.

4. Implementation

4.1. Hardware Overview

We used a Google Pixel 3 (Google, Mountain View, CA, USA) smartphone and a Kinect V2 sensor (Microsoft, Redmond, WA, USA) [53] as our primary hardware components. To capture motion and skeleton information for each user, we used the Kinect V2 as the depth sensor and tracked their body movement. The depth sensor was connected to a computer running Windows 10 via a USB 3.0 controller and a Kinect adapter for Windows.

4.2. Software Overview

  • Development Tools
    We developed our software using the Unity game engine (Unity Technologies, San Francisco, CA, USA) [59], version 2019.1.14f1, with the Vuforia SDK and Cinema Mocap for Unity. The Vuforia SDK was used to recognize the ground plane and create the AR experience; Cinema Mocap [60] was used to convert body tracking data into animation clips.
  • Personalized Motion
    To convert the captured mo-cap data into animation, we used Cinema Mocap, a marker-less motion capture solution for Unity, to create customized animations for users. As the movement captured by the Kinect V2 depth sensor is quite jittery, we edited and smoothed the animation frame-by-frame in Maya [54], a 3D computer animation, modeling, simulation, and rendering package. The animations were then imported into Unity and attached to the virtual avatars.
    Our framework for motion capture using Kinect is shown in Figure 7. The movements of users in the real world are converted into avatar animation in the virtual world in three steps:
    Record the users’ movements: We recorded users’ movements using the Kinect V2 depth sensor and Cinema Mocap in Unity. The Kinect V2 sensor served as a skeleton camera to track the user’s body motion, and Cinema Mocap converted the motion capture data into avatar animation.
    Smooth the animation: The animation captured by the Kinect V2 sensor contains trembling movements, which we smoothed in Maya.
    Attach the animation to the virtual avatar: Before attaching the animation to the avatar, we rigged the skeleton and skin of the virtual avatar using Mixamo [61]. We then used an animation controller in Unity to make the virtual avatar perform humanoid, realistic animation.
    Following these three steps, users can view the virtual avatar performing their personalized motions inside the VR or AR environment.
  • Garment Model Generation
    Previous research on garment modeling started from 2D design patterns or 2D sketches, while other methods explored garment resizing and transfer from 3D template garments [27]. Compared to 2D design patterns, 3D garment models simulate the garment more precisely. Therefore, our method extends these approaches by mapping 2D images onto 3D garment models. The texture-mapping method increases the realism of the garment.
    We generated 3D garment models from images on online shopping websites, such as H&M and ZARA, using Marvelous Designer [56] to create the 3D garment models. The texture mapping method is shown in Figure 8. The 3D mesh of a generated garment template can be flattened into a 2D texture UV map in 3ds Max. The 2D UV map contains several parts; for example, a T-shirt can be segmented into three parts: sleeves, front, and side. To map a web garment image to a 3D virtual garment template, we mapped the different segmented parts of the garment image to their corresponding parts on the garment template. In this way, we can generate the 3D textured garment model.
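The segment-to-UV copying described above can be sketched in a few lines. The following Python snippet is a simplified illustration only: the part names, rectangle coordinates, and helper function are hypothetical, and the actual mapping in our pipeline was done interactively in 3ds Max on real UV layouts rather than axis-aligned rectangles.

```python
def paste_segment(texture, src_img, src_box, dst_origin):
    """Copy a rectangular segment of the source garment photo into
    the garment template's UV texture atlas.
    texture:    2D list of pixels, the UV atlas being assembled
    src_img:    2D list of pixels, the web garment photo
    src_box:    (row, col, height, width) of the segment in src_img
    dst_origin: (row, col) of the matching part in the UV layout
    (Hypothetical helper for illustration only.)"""
    r0, c0, h, w = src_box
    dr, dc = dst_origin
    for r in range(h):
        for c in range(w):
            texture[dr + r][dc + c] = src_img[r0 + r][c0 + c]
    return texture

# Assemble a tiny 4x6 UV atlas for a T-shirt with a made-up layout:
photo = [[(r, c) for c in range(4)] for r in range(4)]  # stand-in pixels
atlas = [[None] * 6 for _ in range(4)]
uv_layout = {"front": (0, 0), "sleeves": (0, 4), "side": (2, 4)}
paste_segment(atlas, photo, (0, 0, 4, 4), uv_layout["front"])    # front panel
paste_segment(atlas, photo, (0, 0, 2, 2), uv_layout["sleeves"])  # sleeve strip
```

Real UV islands are irregular polygons, so production tools warp the image segment to the island's shape instead of copying a rectangle; the sketch only conveys the part-to-region correspondence.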

5. Evaluation

5.1. Goal

Both AR and VR can provide users with an enriched and immersive experience during virtual try-on. Specifically, compared to traditional online shopping websites, VR and AR are perceived as more useful because they provide a fuller visualization of the look and feel of the product. However, it is difficult to ascertain which technology is better for virtual try-on. Virtual avatars with personalized human motion increase avatar identification and produce high avatar similarity, which may offer users a better understanding of their own selves and improve their overall body satisfaction. High similarity of virtual avatars enhances the user’s attitude toward the body-involving product and the perceived usefulness of the avatar [36,62].
Therefore, the goals of our research are (1) to validate how an AR/VR-based try-on system may contribute to online shopping and (2) to understand how personalized motion relates to consumer attitudes toward the shopping experience.
Based on the goal of our system, we hypothesized that:
Hypothesis 1 (H1).
AR- and VR-based try-on technologies are better than traditional online shopping and provide consumers with positive experiences.
Hypothesis 2 (H2).
Compared with VR-based try-on, AR-based try-on technology provides a better representation of the product and shows how it fits the environment.
Hypothesis 3 (H3).
AR-based try-on with personalized human motion can provide a more engaging shopping experience and thus increase users’ purchasing intention.
Two experiments (user study 1 and user study 2) were designed to test our hypotheses. User study 1 was conducted to examine the differences between the roles of VR-based try-on and AR-based try-on (H1, H2). User study 2 was designed to investigate the impact of personalized movement on the virtual try-on experience (H3).

5.2. Participants

A total of 12 college students participated in both user studies 1 and 2 to test our proposed hypotheses. Participants were offered extra credit as an incentive to complete the study. According to Merle et al. [10], there is no significant gender difference in the overall VTO adoption process. However, women represent the largest online apparel segment and are more sensitive to online shopping [63,64]. Adults aged 18–30 years are usually the target users of AR or VR applications, as they are more likely to try new technologies and are proactive in shopping online for fashion products. Hence, we invited 12 women (mean age = 22.9 years, SD = 1.62) to participate in the experiment.
Before the experiment, we administered a pre-questionnaire to each participant to assess their online shopping experience and their experience with AR or VR technology. We gathered pre-questionnaire responses from all 12 female participants. All participants shopped online every month, and most had more than four years of experience buying clothes online. Eleven had encountered the fit problem when buying fashion products online. Most participants had some knowledge of the VTO experience and had previously experienced VR/AR. We also investigated the participants’ clothing preferences and found that the most popular clothing type was the T-shirt, followed by dresses, short skirts, and coats. Therefore, we prepared clothes templates according to their clothing preferences.

5.3. User Study 1: Exploration of VR- and AR-Based Try-On

5.3.1. Experimental Design

In order to verify hypotheses H1 and H2, and to understand consumers’ attitudes toward the VR-based try-on, AR-based try-on, and the traditional image-only e-commerce interface, we conducted a within-subject study under three conditions:
Image-only (IO): Participants completed a simulation of an online shopping experience using garment pictures only. They can only imagine what they would look like wearing the clothes.
VR-based try-on (VR-TO): Participants completed a simulation of an online shopping experience with virtual reality. They can try virtual clothes on their static virtual avatar.
AR-based try-on (AR-TO): Participants completed a simulation of an online shopping experience with augmented reality. They can try virtual clothes on their static virtual avatar and view the avatar in a real-life scene.
In this study, all 12 participants were asked to complete all three conditions, and the order of presentation was randomized. After each shopping experience, participants completed a questionnaire. We also added several open-ended questions to capture participants’ reflections on the AR- and VR-based try-on technologies.

5.3.2. Measures

The respondents completed the questionnaire using a 7-point Likert scale (ranging from 1, “strongly disagree”, to 7, “strongly agree”) adopted from existing research. The items on perceived enjoyment were adapted from Childers et al. [65], those on purchase intention from Chandran and Moreitz [66], those on attitude toward the shopping technology from Chen et al. [67], and those on perceived usefulness of the shopping technology from Davis [68]. We measured enjoyment, convenience in examining the product, worry about the fit problem, usefulness, and purchase intention in all three conditions. In addition, we measured garment visualization and attitude toward the shopping technology in both the VR- and AR-based try-on conditions. The questionnaire and measurement items are shown in Table 1.

5.3.3. Results

We separated the results into two sections: (1) analysis of the ratings from the questionnaires and (2) thematic analysis of the participants’ comments. The overall results showed that the AR-based try-on performed best and was the preferred choice for participants, while the traditional online shopping interface was perceived as the worst.
  • Differences in Ratings
    For the statistical analysis of differences, we used a repeated-measures one-way ANOVA. Our study meets almost all of the assumptions required for this test; regarding the normality assumption, the test is robust as long as the data are approximately normally distributed. The one-way ANOVA was performed using SPSS [69] to assess whether there were any statistically significant differences among the means of the three conditions. To establish between-group differences, post hoc tests were run using the Bonferroni method. The mean and standard deviation of the measured variables for every experimental condition are presented in Table 2. Figure 9 shows the differences among the three conditions for the various items.
    The ANOVA results show significant differences in “Enjoyment” (p < 0.001), “Convenience” (p < 0.001), “Usefulness” (p < 0.001), “Worry about fit problem” (p < 0.001), and “Purchase intention” (p < 0.05), while no significant difference was found in “Garment visualization” (p = 0.08) or “Attitude to shopping technology” (p = 0.45). Post hoc tests showed significant differences in enjoyment between the image-only condition and both the AR-based try-on (M_IO = 4.17 vs. M_AR-TO = 6.00, p < 0.001) and the VR-based try-on (M_IO = 4.17 vs. M_VR-TO = 5.25, p < 0.05). We found significant differences in usefulness between the image-only condition and both the AR-based try-on (M_IO = 3.83 vs. M_AR-TO = 5.92, p < 0.001) and the VR-based try-on (M_IO = 3.83 vs. M_VR-TO = 5.08, p < 0.01). We also analyzed whether users worried about the fit problem after using each kind of shopping technology: the image-only condition differed from the AR-based try-on (M_IO = 5.58 vs. M_AR-TO = 3.00, p < 0.001) and the VR-based try-on (M_IO = 5.58 vs. M_VR-TO = 3.58, p < 0.001). The image-only condition also differed from the AR-based try-on (M_IO = 3.17 vs. M_AR-TO = 6.00, p < 0.001) and the VR-based try-on (M_IO = 3.17 vs. M_VR-TO = 5.75, p < 0.001) in convenience. In addition, we found a significant difference between the image-only condition and the AR-based try-on in purchase intention (M_IO = 4.25 vs. M_AR-TO = 5.42, p < 0.001).
  • Qualitative Differences
    We also asked about participants’ preferences for each condition at the end of the study; the questionnaire administered at the end of the experiment contained a set of open-ended questions. We found that all participants preferred the AR-based condition. We conducted a thematic analysis of the participants’ responses for the AR- and VR-based try-ons.
    For AR-based try-on, we collected participants’ comments and summarized the recurring keywords. Table 3 shows the recurring themes and the number of participants mentioning each for AR-based try-on. Participants liked the AR-based try-on because it provided a real interactive environment and enabled placing the virtual avatar into a real-life scene. Some participants found the AR view helpful for observing the garment model, and manipulating the garment model was easier. AR-based try-on allows users to manipulate their virtual avatars in the real world, which improves the interactivity between the user and the virtual avatar, thus enhancing user enjoyment in the shopping process.
    For VR-based try-on, we also collected comments from the participants and summarized the related keywords (Table 4). Participants disliked VR-based try-on for several reasons. They complained about difficulties in observing the garment model within the VR view; this is a limitation of the fixed camera perspective, which cannot provide either a good overview or a detailed view. Participants also complained that the virtual avatar did not look like themselves in the VR view owing to the virtual environment and lighting problems, which affected their fitting experience.
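The repeated-measures ANOVA and Bonferroni-corrected post hoc procedure described above were performed in SPSS; as a self-contained sketch, the same analysis can be written in Python. The ratings below are synthetic, generated only to illustrate the procedure, and are not the study’s data.

```python
import numpy as np
from scipy import stats

def rm_anova(X):
    """One-way repeated-measures ANOVA on an (n_subjects x k_conditions) array."""
    n, k = X.shape
    grand = X.mean()
    ss_cond = n * ((X.mean(axis=0) - grand) ** 2).sum()   # between conditions
    ss_subj = k * ((X.mean(axis=1) - grand) ** 2).sum()   # between subjects
    ss_err = ((X - grand) ** 2).sum() - ss_cond - ss_subj # residual
    df_cond, df_err = k - 1, (n - 1) * (k - 1)
    f_stat = (ss_cond / df_cond) / (ss_err / df_err)
    return f_stat, stats.f.sf(f_stat, df_cond, df_err)

def bonferroni_posthoc(X):
    """Paired t-tests for all condition pairs with Bonferroni correction."""
    _, k = X.shape
    n_pairs = k * (k - 1) // 2
    results = {}
    for i in range(k):
        for j in range(i + 1, k):
            t, p = stats.ttest_rel(X[:, i], X[:, j])
            results[(i, j)] = (t, min(1.0, p * n_pairs))  # corrected p-value
    return results

# Synthetic illustration: 12 participants x 3 conditions (IO, VR-TO, AR-TO),
# with condition means loosely echoing the reported enjoyment ratings.
rng = np.random.default_rng(1)
ratings = (rng.normal(0.0, 0.3, (12, 1))     # per-subject baseline
           + np.array([4.2, 5.3, 6.0])       # condition effects
           + rng.normal(0.0, 0.3, (12, 3)))  # residual noise
f_stat, p_val = rm_anova(ratings)
posthoc = bonferroni_posthoc(ratings)
```

With strongly separated condition means, the omnibus F is significant and the corrected pairwise comparison between the first and third conditions survives the Bonferroni adjustment, mirroring the shape of the reported analysis.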

5.3.4. Testing of H1 and H2

As shown in Figure 9, we found significant differences between the image-only condition and both the AR- and VR-based try-ons in perceived enjoyment, usefulness, worry about the fit problem, and convenience. As hypothesized, compared with the traditional e-commerce interface, AR- and VR-based try-on create a more positive experience for the user (H1). In addition, AR-based VTO performs better than VR-based VTO, showing positive effects on enjoyment and purchase intention compared with VR-based try-on, and all the participants preferred the AR-based try-on. According to the participants’ comments, most thought that AR-based try-on provided better 3D visualization and more realistic virtual avatars, which made the overall fitting experience more realistic than VR-based try-on (H2). We also found that users focused more on the accuracy of the model in the VR environment, whereas in AR they focused less on the accuracy of the model and more on the overall feel of the model within the real environment. This allows users to better evaluate how the clothes look on their virtual body.

5.4. User Study 2: Exploration of the Personalized Motion on Concern About VTO

5.4.1. Experimental Design

To investigate whether personalized motion VTO can provide a more engaging shopping experience for users and thus increase their purchasing intention, we conducted a within-subject study under three conditions:
AR-based try-on with no motion (NM): Participants can try virtual clothes using a static virtual avatar.
AR-based try-on with pre-defined motion (PDM): Participants can try virtual clothes using a virtual avatar with pre-defined animation.
AR-based try-on with personalized motion (PM): Participants can try virtual clothes using a virtual avatar with pre-recorded animation; the movements of each participant were captured by a Kinect.
We invited 12 participants to our studies. Each participant was asked to shop online using the system under these three conditions, and the order of presentation was randomized. After each shopping experience, participants completed a questionnaire. Then, each participant selected their favorite condition and explained the reason for their selection.

5.4.2. Measures

The respondents completed the questionnaire using a 7-point Likert scale (ranging from 1, “strongly disagree”, to 7, “strongly agree”) adopted from existing research. The items on body similarity were adapted from Banakou [70], and those on purchase intention from Chandran and Moreitz [66]. We measured body similarity, usefulness, garment visualization, enjoyment, and purchase intention under all three conditions, while movement similarity was measured only under the predefined- and personalized-motion conditions. The questionnaire and measurement items are shown in Table 5.

5.4.3. Results

We separated the result into two sections: (1) analysis of the ratings from the questionnaires and (2) thematic analysis of the participants’ comments.
  • Differences in ratings
    One-way ANOVA was performed using SPSS (IBM, Armonk, NY, USA) [69] to assess whether there were statistically significant differences among the means of the three conditions. To establish between-group differences, post hoc tests were run using the Bonferroni method. The mean and standard deviation of the measured variables for each experimental condition are presented in Table 6. Figure 10 shows the differences among the three conditions in the various items.
    The ANOVA results show significant differences in “Usefulness” (p < 0.05) and “Movement similarity” (p < 0.001), while no significant difference was found in “Body similarity” (p = 0.89), “Enjoyment” (p = 0.76), “Garment visualization” (p = 0.87), or “Purchase intention” (p = 0.07). Post hoc tests showed that the personalized-motion VTO was more useful than the no-motion VTO (M_NM = 4.75 vs. M_PM = 5.83, p < 0.05). Regarding movement similarity to real motion, the personalized-motion VTO scored higher than the predefined-motion VTO (M_PDM = 4.36 vs. M_PM = 5.67, p < 0.01). We did not find significant differences among the three conditions in garment visualization, enjoyment, body similarity, or purchase intention.
  • Qualitative differences
    The qualitative comments from each participant are listed in Table 7. The participants evaluated the three systems from four aspects: garment quality, virtual body similarity, enjoyment, and personalized motion. They thought that the interactivity with the garment should be improved and that the material of the garments was not realistic. The personalized virtual avatar provided by our system was similar to the users; in particular, the personalized motion offered users a better sense of fitting clothes on the “real me”, which made the virtual avatar more realistic. As for enjoyment, some participants mentioned that the personalized motion made them more interested in changing motions while fitting.
    We also asked for participants’ preferences for each condition at the end of the study. Among the 12 female participants, 10 preferred the personalized-motion condition, since it offered a better sense of the “real me” and made the virtual avatar’s movement more realistic. In addition, personalized motion can help users gain a sense of wearing the clothes on their own body with the different motions of their daily life. Two participants preferred the predefined-motion condition; they thought that the personalized motion was not as smooth as the predefined motion, especially for the walking animation. If the personalized motion were smoother and more natural, they would choose the personalized-motion VTO as their preferred condition.

5.5. Testing of H3

We hypothesized that AR-based try-on with personalized avatar movement would produce a positive effect on users’ attitudes toward the apparel product and their shopping intentions. To validate H3, we designed user study 2 to explore personalized motion and concerns about the VTO experience. In this experiment, we measured six items to assess the user experience with VTO. We found a significant difference (p < 0.01) in movement similarity between the predefined- and personalized-motion conditions. We used motion capture technology to capture user movement, which offered a better sense of the “real me” and made the virtual avatar’s movement more realistic. However, we found no direct effect of personalized motion on attitudes toward the clothes or on purchase intention, so H3 was not supported. According to the participants’ feedback, one of the main factors influencing user attitudes toward the clothes was the quality of the virtual garment, as several participants responded that the material of the clothes was not realistic enough.

6. Discussion

In this study, we compared the effects of AR- and VR-based try-on (user study 1) and explored the effects of personalized movement during the VTO experience (user study 2). The results of user study 1 revealed that AR-based try-on performed better than VR-based try-on, especially in the 3D representation of clothing and the fidelity of personalized human avatars. With AR technology, the real interactive environment makes the fitting effect more realistic. In contrast, VR-based try-on provides several virtual environments, giving users a sense of wearing clothes under different conditions; however, users’ attention could be drawn to inaccuracies in the avatar. These findings indicate that AR-based try-on shows more potential for the future of online shopping. The results of user study 2 indicated that users show a strong interest in the application of personalized motions in VTO technology; however, a high degree of personalized motion accuracy is required.

6.1. Theoretical Implications

This research provides three important contributions. First, the findings contribute to the use of personalized avatars for garment visualization in e-commerce. Currently, users often use a personal computer or mobile device to shop online, where the shopping experience is limited to 2D displays of products. This constraint makes online shopping less convenient, especially when the resulting look and fit are important factors for these types of products. The main finding of this study is that a highly personalized human avatar is an important factor for realistically presenting online products and for simulating a better fit effect in the VTO experience. We found that AR-based try-on using personalized avatars can provide better 3D visualization of a product because AR can overlay virtual objects onto one’s view of the real environment. Having the 3D virtual product fitted on the personalized human avatar in the user’s real-life scene improves the realism of garment visualization.
Second, our research enriches the literature by comparing the different roles of AR- and VR-based try-on. Previous research [71,72] investigated the impact of AR/VR technology on the online shopping experience and reported mixed results, but overlooked the differences between AR- and VR-based try-on in the VTO experience. As reported above, AR-based try-on resulted in better and more realistic garment visualization than VR-based try-on, since the AR environment includes the real-life scene, making the fitting effect more realistic. In addition, since AR-based try-on can be achieved on a smartphone, users can thoroughly view the 3D garment from different angles, providing a brand-new product presentation experience. Nevertheless, we found that the benefits of AR-based try-on are not always obvious: VR-based try-on offers a variety of virtual wearing scenes, which may inspire some users when designing an outfit. Overall, however, AR-based try-on performed better.
Third, previous research showed that a virtual try-on experience with a personalized virtual avatar can create a positive attitude toward shopping technology and increase purchase intention [73,74,75]. Our research enriches the literature on the impact of personalized motion during the virtual try-on experience. Although virtual try-on with personalized motion did not create a better attitude toward the garment product or higher purchase intention, we found two potential influences of personalized motion on the virtual try-on experience. First, personalized motion provides a better sense of “true fit” owing to the high similarity between user and avatar movements. As discussed in previous research, the success of traditional product presentation with a virtual avatar depends on the similarity of the avatar’s appearance to the real user [10,75,76]. We propose another aspect of avatar similarity: in addition to the avatar’s appearance and body shape, we also include the user’s movement. Second, virtual try-on with personalized motion may offer consumers a helpful and realistic experience. Personalized motion provides users with a more engaging interaction with the virtual product and avatar, and offers a more enjoyable experience to communication-sensitive users when shopping online [77]. Overall, this research offers a better understanding of the impact of personalized motion during the virtual try-on experience. Compared with previous research, we proposed a highly personalized virtual try-on system that differs from the four virtual try-on systems mentioned by Merle et al. [10]. We have opened up a new direction for personalized virtual try-on systems, increasing the possibilities for enhancing online shopping enjoyment and the virtual fitting experience for users.

6.2. Practical Implications

Our findings offer new insights for e-commerce, especially for fashion items. For companies, our results suggest that AR-based try-on systems on mobile phones can be easily used by customers, which can significantly increase shopping enjoyment and decrease the risk of product returns. Using AR or VR technology, companies can display products in more detail using 3D visualization, which allows users to view fashion products from different directions and angles. Retailers may focus on the presentation of 3D garment models on virtual human models in real-life scenes using AR technology.
For customers, our systems provide a better understanding of how a product will look on them. Using 3D modeling technology, users can easily create their virtual avatar from smartphone camera images (e.g., with 3DLOOK). These kinds of 3D avatar generation techniques will increase consumer convenience when examining products before purchasing. The try-on experience with a personalized virtual avatar gives customers a sense of “how it would look”, so that they can better judge whether clothes will suit them, thereby increasing consumer satisfaction with online shopping.

6.3. Limitations and Future Research

This research has limitations that could be addressed in future studies. Some limitations are related to motion capture. At the current stage of research, we used a Kinect to record users’ personalized motions. Since the Kinect sensor’s skeleton detection is limited and its body-joint tracking is not highly accurate, the resulting personalized human motion is not as smooth as the predefined motions. Therefore, future research may use more advanced motion capture equipment to improve the accuracy of human motion. Another limitation is related to the avatar generation technique. At the current stage, we used two separate tools to generate a 3D body model and a face model from the user’s images and then integrated the two models, which requires some manual adjustment. Therefore, future work may focus on generating a complete and precise virtual avatar directly from the user’s images.
Our future research may focus on a fully personalized try-on experience. Garment fit on a personalized virtual avatar may help consumers ascertain fit, thus improving their confidence in purchase choices. We expect future VTO to use a virtual avatar with body motions or even facial expressions. With facial expression detection technology, a virtual avatar may display human-like expressions, thus improving the fidelity of the human avatar.
Finally, as VR- and AR-based try-on systems have different positive effects on the online shopping experience, further research may find a new way of combining VR and AR technology in the VTO experience, which may help build a fully immersive and interactive interface. AR/VR technology allows for the possibility of creating a fully real/virtual environment, and a fully immersive AR/VR experience provides users with a unique and innovative experience. The results of user study 1 showed that AR produces a better effect on VTO than VR. However, the results also showed that the virtual environment in VR-based try-on gives users a more immersive shopping experience, thus inspiring users in terms of outfit design under different wearing conditions. Therefore, further research may combine the advantages of AR and VR technology and discover a new method that may improve different aspects of the online shopping experience and overall customer satisfaction. For instance, in AR-based try-on, virtual components could be combined with the real scene to simulate a variety of virtual scenes, helping users try on clothes under different wearing conditions.

Author Contributions

Conceptualization, K.C., S.M., J.T., Y.L. (Yuzhao Liu), Y.L. (Yuhan Liu), and S.X.; Methodology, Y.L. (Yuzhao Liu) and Y.L. (Yuhan Liu); Writing—original draft preparation, Y.L. (Yuzhao Liu) and Y.L. (Yuhan Liu); Writing—review and editing, K.C., S.M., J.T., and S.X.; Supervision, J.T.; Project administration, K.C., S.M., and J.T. All authors have read and agreed to the published version of the manuscript.


Funding

This research received no external funding.

Conflicts of Interest

The authors declare no conflict of interest.


Abbreviations

The following abbreviations are used in this paper:
VTO	Virtual Try-On
VR	Virtual Reality
AR	Augmented Reality
MoCap	Motion Capture
NM	No-Motion VTO Condition
PM	Personalized-Motion VTO Condition
PDM	Pre-defined-Motion VTO Condition
VR-TO	VR-based try-on
AR-TO	AR-based try-on


References

  1. Martin, C.G.; Oruklu, E. Human friendly interface design for virtual fitting room applications on Android-based mobile devices. J. Signal Inf. Process. 2012, 3, 481. [Google Scholar]
  2. Beck, M.; Crié, D. I virtually try it...I want it! Virtual Fitting Room: A tool to increase on-line and off-line exploratory behavior, patronage and purchase intentions. J. Retail. Consum. Serv. 2018, 40, 279–286. [Google Scholar] [CrossRef]
  3. Shin, S.J.H.; Chang, H.J.J. An examination of body size discrepancy for female college students wanting to be fashion models. Int. J. Fash. Des. Technol. Educ. 2018, 11, 53–62. [Google Scholar] [CrossRef]
  4. Cases, A.S. Perceived risk and risk-reduction strategies in Internet shopping. Int. Rev. Retail Distrib. Consum. Res. 2002, 375–394. [Google Scholar] [CrossRef]
  5. Rosa, J.A.; Garbarino, E.C.; Malter, A.J. Keeping the body in mind: The influence of body esteem and body boundary aberration on consumer beliefs and purchase intentions. J. Consum. Psychol. 2006, 16, 79–91. [Google Scholar] [CrossRef]
  6. Blázquez, M. Fashion shopping in multichannel retail: The role of technology in enhancing the customer experience. Int. J. Electron. Commer. 2014, 18, 97–116. [Google Scholar] [CrossRef][Green Version]
  7. Gao, Y.; Petersson Brooks, E.; Brooks, A.L. The Performance of Self in the Context of Shopping in a Virtual Dressing Room System. In HCI in Business; Nah, F.F.H., Ed.; Springer International Publishing: Cham, Switzerland, 2014; pp. 307–315. [Google Scholar]
  8. Kim, D.E.; LaBat, K. Consumer experience in using 3D virtual garment simulation technology. J. Text. Inst. 2013, 104, 819–829. [Google Scholar] [CrossRef]
  9. Lau, K.W.; Lee, P.Y. The role of stereoscopic 3D virtual reality in fashion advertising and consumer learning. In Advances in Advertising Research (Vol. VI); Springer: Berlin/Heidelberg, Germany, 2016; pp. 75–83. [Google Scholar]
  10. Merle, A.; Senecal, S.; St-Onge, A. Whether and how virtual try-on influences consumer responses to an apparel web site. Int. J. Electron. Commer. 2012, 16, 41–64. [Google Scholar] [CrossRef]
  11. Bitter, G.; Corral, A. The pedagogical potential of augmented reality apps. Int. J. Eng. Sci. Invent. 2014, 3, 13–17. [Google Scholar]
  12. Liu, F.; Shu, P.; Jin, H.; Ding, L.; Yu, J.; Niu, D.; Li, B. Gearing resource-poor mobile devices with powerful clouds: Architectures, challenges, and applications. IEEE Wirel. Commun. 2013, 20, 14–22. [Google Scholar]
  13. Wexelblat, A. Virtual Reality: Applications and Explorations; Academic Press: Cambridge, MA, USA, 2014. [Google Scholar]
  14. Yaoyuneyong, G.; Foster, J.; Johnson, E.; Johnson, D. Augmented reality marketing: Consumer preferences and attitudes toward hypermedia print ads. J. Interact. Advert. 2016, 16, 16–30. [Google Scholar] [CrossRef]
  15. Parekh, P.; Patel, S.; Patel, N.; Shah, M. Systematic review and meta-analysis of augmented reality in medicine, retail, and games. Vis. Comput. Ind. Biomed. Art 2020, 3, 1–20. [Google Scholar] [CrossRef] [PubMed]
  16. Javornik, A.; Rogers, Y.; Moutinho, A.M.; Freeman, R. Revealing the shopper experience of using a “magic mirror” augmented reality make-up application. In Proceedings of the 2016 ACM Conference on Designing Interactive Systems, Brisbane, Australia, 4–8 June 2016; Association For Computing Machinery (ACM): New York, NY, USA, 2016; Volume 2016, pp. 871–882. [Google Scholar]
  17. Lee, H.; Leonas, K. Consumer experiences, the key to survive in an omni-channel environment: Use of virtual technology. J. Text. Appar. Technol. Manag. 2018, 10, 1–23. [Google Scholar]
  18. De França, A.C.P.; Soares, M.M. Review of Virtual Reality Technology: An Ergonomic Approach and Current Challenges. In Advances in Ergonomics in Design; Rebelo, F., Soares, M., Eds.; Springer International Publishing: Cham, Switzerland, 2018; pp. 52–61. [Google Scholar]
  19. Han, X.; Wu, Z.; Wu, Z.; Yu, R.; Davis, L.S. Viton: An image-based virtual try-on network. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Salt Lake City, UT, USA, 18–22 June 2018; pp. 7543–7552. [Google Scholar]
  20. Sekine, M.; Sugita, K.; Perbet, F.; Stenger, B.; Nishiyama, M. Virtual fitting by single-shot body shape estimation. In Proceedings of the International Conference on 3D Body Scanning Technologies, Lugano, Switzerland, 21–22 October 2014; pp. 406–413. [Google Scholar]
  21. Decaudin, P.; Julius, D.; Wither, J.; Boissieux, L.; Sheffer, A.; Cani, M.P. Virtual garments: A fully geometric approach for clothing design. In Computer Graphics Forum; Wiley Online Library: Hoboken, NJ, USA, 2006; Volume 25, pp. 625–634. [Google Scholar]
  22. Hilsmann, A.; Eisert, P. Tracking and retexturing cloth for real-time virtual clothing applications. In Proceedings of the International Conference on Computer Vision/Computer Graphics Collaboration Techniques and Applications, Rocquencourt, France, 4–6 May 2009; Springer: Berlin/Heidelberg, Germany, 2009; pp. 94–105. [Google Scholar]
  23. Yamada, H.; Hirose, M.; Kanamori, Y.; Mitani, J.; Fukui, Y. Image-based virtual fitting system with garment image reshaping. In Proceedings of the 2014 International Conference on Cyberworlds, Santander, Spain, 6–8 October 2014; pp. 47–54. [Google Scholar]
  24. Chen, X.; Zhou, B.; Lu, F.X.; Wang, L.; Bi, L.; Tan, P. Garment modeling with a depth camera. ACM Trans. Graph. 2015, 34, 1–12. [Google Scholar] [CrossRef]
Figure 1. The relationship between physical try-on and virtual try-on (VTO).
Figure 2. Our system takes three images as input: a single frontal face image and two full-body photos (front and side). From these, a personalized avatar is generated for each user.
Figure 3. Workflow for recording personalized user motion.
Figure 4. 3D garment model templates provided to users (left) and textured 3D garment models (right).
Figure 5. 3D virtual try-on interface: VR-based try-on (left) and AR-based try-on (right).
Figure 6. Selectable virtual scenes: (a) supermarket, (b) street, and (c) office.
Figure 7. Framework of motion capture.
Figure 8. Mapping a web garment image onto the generated 3D garment templates.
Figure 9. Perceived levels of enjoyment, usefulness, worry about fit problem, convenience, purchase intention, garment visualization, and attitude to shopping technology. Note: * p < 0.05, ** p < 0.01, *** p < 0.001.
Figure 10. Perceived levels of body similarity, usefulness, garment visualization, purchase intention, and movement similarity. Note: * p < 0.05, ** p < 0.01.
Table 1. Questionnaire and measurement items.
Measurement Items | Questions
Enjoyment | Shopping with this system was enjoyable for me.
Convenience in examining the product | I gain a sense of how the outfit might look for various occasions.
Garment visualization | Having a model in a virtual/real environment helps me understand more about the appearance of the garments.
Worry about fit problem | I feel worried that the clothes I choose may be unsuitable for me.
Usefulness | This shopping system would enhance the effectiveness of the shopping experience.
Purchase intention | It is very likely that I would purchase this product.
Attitude toward the shopping technology | I want to use this system when I buy clothes online in the future.
Table 2. Post-hoc between-group results.
Dependent Variable | Condition (A), Mean (SD) | Condition (B), Mean (SD) | Significance (A vs. B)
Perceived enjoyment | Image-only, 4.17 (1.115) | VR-based try-on, 5.25 (1.138) | p < 0.05
Perceived enjoyment | Image-only, 4.17 (1.115) | AR-based try-on, 6.00 (0.853) | p < 0.001
Perceived enjoyment | VR-based try-on, 5.25 (1.138) | AR-based try-on, 6.00 (0.853) | ns
Usefulness | Image-only, 3.83 (1.267) | VR-based try-on, 5.08 (0.669) | p < 0.01
Usefulness | Image-only, 3.83 (1.267) | AR-based try-on, 5.83 (0.577) | p < 0.001
Usefulness | VR-based try-on, 5.08 (0.669) | AR-based try-on, 5.83 (0.577) | ns
Worry about fit problem | Image-only, 5.58 (0.996) | VR-based try-on, 3.58 (1.165) | p < 0.001
Worry about fit problem | Image-only, 5.58 (0.996) | AR-based try-on, 3.00 (0.953) | p < 0.001
Worry about fit problem | VR-based try-on, 3.58 (1.165) | AR-based try-on, 3.00 (0.953) | ns
Purchase intention | Image-only, 4.25 (1.422) | VR-based try-on, 4.75 (0.866) | ns
Purchase intention | Image-only, 4.25 (1.422) | AR-based try-on, 5.42 (0.793) | p < 0.05
Purchase intention | VR-based try-on, 4.75 (0.866) | AR-based try-on, 5.42 (0.793) | ns
Convenience | Image-only, 3.17 (1.267) | VR-based try-on, 5.75 (0.866) | p < 0.001
Convenience | Image-only, 3.17 (1.267) | AR-based try-on, 5.50 (0.798) | p < 0.001
Convenience | VR-based try-on, 5.75 (0.866) | AR-based try-on, 5.50 (0.798) | ns
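Results of this kind come from an omnibus one-way ANOVA followed by pairwise post-hoc comparisons. The study itself used SPSS; as an illustrative sketch only, the same style of analysis can be reproduced in Python with SciPy. The ratings below are hypothetical stand-ins (n = 12 per condition, 7-point scale), not the study's data, and Bonferroni correction stands in for whatever post-hoc procedure SPSS applied.

```python
# Illustrative sketch: one-way ANOVA + Bonferroni-corrected pairwise t-tests
# on hypothetical 7-point ratings from three conditions. Not the authors'
# analysis script and not their data.
from itertools import combinations
from scipy import stats

# Hypothetical ratings, n = 12 per condition as in the study design
ratings = {
    "image-only": [4, 5, 3, 4, 6, 4, 5, 3, 4, 4, 5, 3],
    "vr-try-on":  [5, 6, 5, 4, 6, 5, 5, 6, 4, 5, 6, 6],
    "ar-try-on":  [6, 6, 5, 7, 6, 6, 5, 7, 6, 6, 5, 7],
}

# Omnibus one-way ANOVA across the three conditions
f_stat, p_omnibus = stats.f_oneway(*ratings.values())
print(f"ANOVA: F = {f_stat:.2f}, p = {p_omnibus:.4f}")

# Pairwise t-tests with Bonferroni correction (3 comparisons)
n_comparisons = 3
for a, b in combinations(ratings, 2):
    t, p = stats.ttest_ind(ratings[a], ratings[b])
    p_corrected = min(p * n_comparisons, 1.0)
    print(f"{a} vs {b}: t = {t:.2f}, corrected p = {p_corrected:.3f}")
```

A corrected p below 0.05 would be reported as "p < 0.05" in the table, and anything above as "ns".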
Table 3. Illustrative excerpts and main conclusions of free comments mentioning AR-based try-on.
Categories | Conclusions | Free Comments
AR-based try-on | Highly realistic | "Light and environment in AR-based try-on are more realistic than in VR-based try-on." "The AR-based real environment is closer to a real-life scene, making the fitting effect more real."
AR-based try-on | High accuracy of human avatar | "The virtual human model in VR-based try-on does not look like me." "I can pay more attention to the sense of the whole body in AR-based try-on."
AR-based try-on | Better 3D garment visualization | "I can see the effect close up or from a distance in AR-based try-on. It is very convenient for checking the details of the garments." "I can obtain a view by looking thoroughly and examining the garments in detail from different angles."
AR-based try-on | High interactivity | "AR-based try-on is superimposed on the real world. I can interact with my model in the real-world background, right next to me." "It is more interesting to interact with my own virtual avatar in a real-life scene." "It is a game-like experience; I can put my virtual body into real life and interact with it. It is very interesting for me."
Table 4. Illustrative excerpts and main conclusions of free comment mentioning the VR-based try-on.
Categories | Conclusions | Free Comments
VR-based try-on | Higher attractiveness | "The different wearing conditions give me some inspiration on how to design an outfit. It is very interesting." "AR-based try-on looks better, but I also like the changing-environment function in VR-based try-on. So, if they can be combined, it will be better."
VR-based try-on | Low accuracy of human avatar | "I think the virtual human model in VR-based try-on does not look like me, so when I am fitting clothes in VR, it is not realistic."
VR-based try-on | Single perspective and lower interactivity | "I can view virtual human models and clothes models in 360 degrees in AR-based try-on. The perspective of VR-based try-on is too simple. I cannot view the models from the viewpoints that I wanted."
Table 5. Questionnaire and measurement items.
Measurement Items (User Study 2) | Questions
Body similarity | I feel that the virtual body I saw was my own body.
Movement similarity | I feel that the movement of the virtual avatar was similar to my own movements.
Usefulness | I can imagine what it looks like when I am wearing clothes by performing some activities in real life.
Garment visualization | A model walking in a real environment helps me know more about the appearance of the clothes.
Enjoyment | Seeing my own model in the real world makes me feel interested.
Purchase intention | The probability that I would buy the product is very high.
Table 6. Post hoc between-group results.
Dependent Variable | Condition (A), Mean (SD) | Condition (B), Mean (SD) | Significance (A vs. B)
Body similarity | no motion, 5.58 (0.793) | predefined motion, 5.75 (0.965) | ns
Body similarity | no motion, 5.58 (0.793) | personalized motion, 5.67 (0.778) | ns
Body similarity | predefined motion, 5.75 (0.965) | personalized motion, 5.67 (0.778) | ns
Usefulness | no motion, 4.75 (1.138) | predefined motion, 5.25 (0.622) | ns
Usefulness | no motion, 4.75 (1.138) | personalized motion, 5.83 (0.835) | p < 0.05
Usefulness | predefined motion, 5.25 (0.622) | personalized motion, 5.83 (0.835) | ns
Garment visualization | no motion, 5.50 (0.905) | predefined motion, 5.58 (0.669) | ns
Garment visualization | no motion, 5.50 (0.905) | personalized motion, 5.67 (0.778) | ns
Garment visualization | predefined motion, 5.58 (0.669) | personalized motion, 5.67 (0.778) | ns
Enjoyment | no motion, 6.00 (1.128) | predefined motion, 6.08 (0.669) | ns
Enjoyment | no motion, 6.00 (1.128) | personalized motion, 6.25 (0.622) | ns
Enjoyment | predefined motion, 6.08 (0.669) | personalized motion, 6.25 (0.622) | ns
Purchase intention | no motion, 5.08 (0.793) | predefined motion, 5.50 (0.522) | ns
Purchase intention | no motion, 5.08 (0.793) | personalized motion, 5.67 (0.492) | ns
Purchase intention | predefined motion, 5.50 (0.522) | personalized motion, 5.67 (0.492) | ns
Table 7. Illustrative excerpts and main conclusions in free comments.
Categories | Conclusions | Free Comments
Garment quality | The material of the clothes is not realistic. | "The garment looks unrealistic." "The material of the clothes does not look real."
Body similarity | Personalized motion makes the virtual avatar more similar to the user. | "The virtual avatar looks like me." "The face of the virtual avatar is very similar to mine." "The virtual avatar is just another me."
Enjoyment | AR-based VTO with personalized motion is enjoyable for users. | "The personalized motion condition makes me more interested in changing my avatar's motion." "Personalized motion is realistic, making me feel engaged."
Personalized motion | Although the personalized motion is not as smooth as the predefined motion, it is more similar to users' own movements. | "The personalized motion looks more natural and realistic." "Compared with the predefined motion, personalized motion offers a sense of the real me, which makes the virtual avatar's movement more realistic." "The predefined motion is standardized and does not look like me." "The personalized motion is not smooth; I could gain a better understanding with a smoother motion." "With the personalized motion, I can also check the shape of the clothes while the model is moving." "The AR-based VTO with personalized motion is closest to me. I feel like I am looking into a mirror." "It would be nice if the virtual avatar could have the user's facial expression as well."
Liu, Y.; Liu, Y.; Xu, S.; Cheng, K.; Masuko, S.; Tanaka, J. Comparing VR- and AR-Based Try-On Systems Using Personalized Avatars. Electronics 2020, 9, 1814.
