Article

Would You Hold My Hand? Exploring External Observers’ Perception of Artificial Hands

Svenja Y. Schött, Patricia Capsi-Morales, Steeven Villa, Andreas Butz and Cristina Piazza
1 Institute for Informatics, LMU Munich, Frauenlobstr. 7a, 80337 Munich, Germany
2 School of Computation, Information and Technology and Munich Institute of Robotics and Machine Intelligence (MIRMI), Technical University of Munich (TUM), Boltzmannstrasse 3, 85748 Garching, Germany
* Author to whom correspondence should be addressed.
Multimodal Technol. Interact. 2023, 7(7), 71; https://doi.org/10.3390/mti7070071
Submission received: 1 June 2023 / Revised: 6 July 2023 / Accepted: 12 July 2023 / Published: 17 July 2023
(This article belongs to the Special Issue Challenges in Human-Centered Robotics)

Abstract

Recent technological advances have enabled the development of sophisticated prosthetic hands, which can help their users compensate for lost motor functions. While research and development have mostly addressed the functional requirements and needs of prosthesis users, the broader societal perception of these devices (e.g., by external observers not affected by limb loss themselves) has not yet been thoroughly explored. To fill this gap, we investigated how the physical design of artificial hands influences their perception by external observers. First, we conducted an online study (n = 42) to explore the emotional response of observers toward three different types of artificial hands. Then, we conducted a lab study (n = 14) to examine the influence of design factors and depth of interaction on perceived trust and usability. Our findings indicate that some design factors directly impact the trust individuals place in the system’s capabilities. Furthermore, engaging in deeper physical interactions leads to a more profound understanding of the underlying technology. Thus, our study shows the crucial role of design features and interaction in shaping the emotions around, trust in, and perceived usability of artificial hands. These factors ultimately impact the overall perception of prosthetic systems and, hence, the acceptance of these technologies in society.

1. Introduction

The field of human–robot interaction (HRI) investigates the dynamics and the way humans interact with robots in different scenarios; for example, an individual would have different attitudes, perceptions, and goals when interacting with a social robot such as NAO [1] than when interacting with an industrial robot like PR2 [2] or KUKA [3]. Goetz et al. [4] argue that the overall design of a given robot influences the interaction dynamics, leading to distinct initial assumptions concerning the capabilities of social robots as opposed to industrial systems. Similarly, the appearance of the robot, as well as the observer’s prior experiences [5,6], shape their understanding of the system; different design features and materials can inform the observer about robot capabilities and how humans can interact with it [4,7,8,9]. Furthermore, the expectations that a human forms based on the robot’s aesthetics are sustained over time. Paetzel et al. [10] found that first impressions of social robots are persistent, i.e., they affect the long-term perception of such robots.
Robots are embodied agents [11] that may have haptic interactions with human interaction partners, e.g., via hands. An interesting field of application is upper limb prostheses, as users physically interact with the external world and other people mostly through these artificial limbs. They can help the user recover lost motor functions [12,13], but they still fall short of restoring the social dimension. For instance, the reported rejection rate of upper limb prostheses ranges between 20% [14] and 44% [15] among users. As design features shape human expectations of robots [16,17], and artificial hands are classified as rehabilitation robots [18], one can assume that the design features of artificial hands have an effect on the perception of external observers (i.e., society), and, thus, affect technology acceptance. To our knowledge, the perception of artificial hands has not been investigated in terms of trust and usability, nor in its relation to design features.
In this paper, we conduct a mixed-method study on attribute-based robot-related factors, i.e., the design of and interaction with the robot. Specifically, we focus on emotional response, trust, and perceived usability as key dimensions of our analysis. In a two-phase approach, we ran an online survey with 42 participants and a lab study with 14 participants, where participants either observed or interacted with three different types of artificial hands, selected as a representation of the state of the art. In this initial exploration, we examine the impact of the artificial hands as objects that evoke potential emotional responses by themselves. Thus, we focus on the inherent response to artificial hands in a controlled environment. With this study, we aim to answer the research question: How do the (i) design of artificial hands and (ii) interaction depth impact external observers’ emotional response and their perception regarding usability and trust?
Our study reveals that the design of artificial hands significantly affects the emotional valence they evoke and their perceived efficiency, as well as the stimulation and novelty of the participants’ experience. We discuss how the design factors (i.e., level of compliance, aesthetic appearance, and design complexity) affect the perceived usability of artificial hands. Additionally, the design influences the trust that humans place in the capabilities of the hand, but does not have a significant effect on moral-related aspects of trust. In-depth interactions with active movements lead to a better understanding of the system but show a limited impact on the general user experience. By outlining factors affecting the perceived trust and usability of current artificial hands, this work gives valuable insights and identifies areas for improvement to increase the acceptance of upper limb prostheses.

2. Background

Artificial hands are used in a wide range of applications, from prosthetics that assist people who have lost part of their limbs to integration with robots in the field of HRI [19]. Within rehabilitation robotics, bionic limbs are robotic systems that are directly attached to the human body to compensate for a missing body part [18]. Most hand prostheses aim to recreate the appearance and functionality of missing human hands. The social acceptance of prosthetic devices remains a pressing issue that hinders their seamless assimilation and widespread recognition [20]. Furthermore, artificial limbs can affect the perception that external observers have of the users and lead to stereotypes. For instance, prosthesis users are stereotyped as more competent than people with limb loss who do not use prostheses [21]. However, prosthesis users are still perceived as less competent than able-bodied people [21]. Thus, prosthesis users may reject an artificial limb due to cultural and social implications, especially in developing countries [20].
Conventional design approaches favor a simple architecture (i.e., using a single motor) and the use of a cosmetic glove that replicates the appearance of the human hand and the user’s skin color. Recent trends are gravitating towards more machine-oriented aesthetics, and some designs even allow each finger to be controlled individually. Piazza et al. [19] observed a growing interest in the scientific community towards flexible solutions and soft robotic technologies, which draw inspiration from the anatomical structure and functionalities exhibited by the human hand. Modern artificial hands can be classified by their level of compliance, i.e., rigid and soft prostheses [22]. Soft artificial hands are made of elastic materials and thus are theorized to pose less immediate risk for close physical interaction. Jørgensen et al. [23] conducted a comparative analysis of the perceptual aspects of soft robots and conventional rigid robots. Although no statistically significant difference was observed in the quantitative appeal ratings of the two robot types, the qualitative results indicate that the interaction differs depending on the robot material. Furthermore, the stiffness of prosthetic hands affects the grasping behavior, which further impacts the perceived human-likeness and the comfort of external observers [24].
The field of human–robot interaction has explored different factors that affect the perception of robotic systems. Trust is one of the central research areas [25], as it determines the willingness of users to interact with a system [26]. The factors that influence trust are divided into robot-related, human-related, and environment-related variables [27,28]. Robot-related factors, e.g., appearance or level of automation, further influence the trust that humans have in a robotic interaction partner [27,28]. The characteristics of the robot, i.e., its performance and attributes, have the largest effect on trust in HRI, surpassing the impact of human-related and environment-related factors [27]. In the absence of trust, individuals may be reluctant to interact with the system [26,29]. Thus, one of the central aims of HRI research is to facilitate an appropriate level of trust in robots [25].
Trust in robots has characteristics from both human–machine and human–human interaction [30]. Robot performance is one of the best predictors of trust [27]. Studies in HRI have investigated the impact of robot errors [31,32] and transparency [33] on trust. Trust has also received attention in prosthetics research: Abd et al. [34] found that simulated system malfunctions negatively impact trust and satisfaction, and increase frustration with the prosthetic arm. Jabban et al. [35] evaluated the perception of primary users of upper limb prostheses and the relation to sensory feedback sources. They found that trust is fundamental for prolonged usage (i.e., not abandoning the prosthetic limb).

3. Method

This work explores the perception of artificial hands at first-time interaction in a mixed-method user study. First, we distributed an online survey on the emotional response, permitting only visual interactions with the systems. Thereafter, we conducted a study in a lab setting, where participants physically interacted with different types of artificial hands (see Figure 1). We considered two independent variables: the design of the hand and the depth of the interaction.
We selected three artificial hands (see Figure 2) as a representation of the state of the art of commercial and research devices: a hook-style prosthesis (VariPlus Speed, Ottobock, Duderstadt, Germany) [36], a multigrip anthropomorphic prosthesis (iLimb-Ultra, Ossur, Reykjavik, Iceland) [37], and a soft robotic hand (qb SoftHand, QBRobotics, Navacchio, Italy) [38]. The systems present multiple differences in their design features. The VariPlus Speed is a simple gripper with two degrees of freedom (DoFs) and rigid properties. It consists of one motor that commands the closure of the index and middle fingers and the thumb with a linkage-based actuation mechanism. The glove includes a passive ring and little finger that follow the movement of the index and middle fingers and support grasping. The iLimb-Ultra is a multigrip prosthetic hand that includes five individually powered digits and offers an electrically rotating thumb with a manual override for the execution of multiple grasp patterns. Even though this design presents two phalanges per finger, its components are also rigid, with a linkage-based actuation mechanism. Finally, the qb SoftHand is a highly underactuated robotic hand with soft properties, designed for industrial applications. It includes a total of 19 DoFs, with three phalanges per finger and two at the thumb, actuated with a single motor and a tendon-driven mechanism. Elastic bands connect the phalanges and provide the hand with two functionalities: reopening the hand when the motor is deactivated and deforming the fingers’ configuration when contact forces are applied, in order to perform multiple grasp patterns and adapt to the object shape. Accordingly, these artificial hands are characterized by three main design factors: (i) level of compliance, (ii) aesthetic appearance, and (iii) design complexity. An overview of each device included in the study is presented in Table 1.

3.1. Online Survey

We explored the impact of the independent variable design on the emotional response in an online survey. The questionnaire displayed images of all three investigated prostheses in random order. After looking at each picture, the participants rated their emotional response to the design using the self-assessment manikin (SAM) scale [39]. This technique is used to capture the affective reactions of participants to different stimuli. All pictures were rated in terms of valence, arousal, and dominance on a 9-point Likert scale. SAM further uses accompanying images to clarify the meaning of these three dimensions to people filling out the questionnaire. The survey took 5 min to complete; participation was voluntary and not compensated.

Participants

A group of 42 participants, aged 18–48 (M = 24.76, SD = 5.13; 18 female, 22 male), filled out the questionnaire. In order to consider the potential influence of cultural differences, we categorized the participants into broad cultural groups based on [40]. A total of 73.8% of participants were European, 16.7% were Asian, 4.8% were African, and 2.4% were American. Half of the participants (50.0%) had interacted with artificial hands 1–10 times, while 40.5% of them had never interacted with any robotic hand before. Only 9.5% of participants stated that they interact with artificial hands often.

3.2. Study: Interacting with Artificial Hands

In the second study, we explored two independent variables, design and interaction depth, during physical interactions of external users with artificial hands. The interaction depth is either simple physical touch or interaction while the artificial hand is moving. In total, all participants interacted with each of the three artificial hands twice, which leads to a 2 × 3 within-subject study design. The order of the different prosthetic hands was counterbalanced to minimize order effects.
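To make the counterbalancing step concrete, the short Python sketch below cycles through all six possible presentation orders of the three hands. The paper does not report the exact counterbalancing scheme, so this cycling strategy, the function name, and the listing itself are illustrative assumptions rather than the authors' actual procedure.

```python
# Hypothetical counterbalancing sketch: the study only states that the order of
# the prosthetic hands was counterbalanced; the cycling scheme below is an
# assumption for illustration, not the authors' actual assignment.
from itertools import permutations

HANDS = ["VariPlus Speed", "iLimb-Ultra", "qb SoftHand"]

def counterbalanced_orders(n_participants: int) -> list[tuple[str, ...]]:
    """Cycle through all 3! = 6 orderings so each order appears about equally often."""
    all_orders = list(permutations(HANDS))  # 6 possible presentation orders
    return [all_orders[i % len(all_orders)] for i in range(n_participants)]

for pid, order in enumerate(counterbalanced_orders(14), start=1):
    print(f"P{pid}: {' -> '.join(order)}")
```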

Procedure

First, the experimenter handed the participants the artificial hands one by one. The participants had the opportunity to touch each prosthetic hand and explore how it felt for one minute. After the static interactions, the participants interacted with each of the three artificial hands while it was in action for one minute. The experimenter controlled the closing and opening movements of each artificial hand. In this way, participants could experience how each artificial hand moves and feel whether the interaction changes. After each interaction, the participant filled out a questionnaire to rate their perception of the specific design. For this purpose, the User Experience Questionnaire (UEQ) [41] and the Multi-Dimensional Conception and Measure of Human–Robot Trust (MDMT) [30] were used. The UEQ is a validated questionnaire to assess six usability subfactors. The questionnaire uses 26 semantic differentials, e.g., “annoying—enjoyable”, “clear—confusing”, to capture participants’ impressions on a 7-point scale. It has been applied in research on technology acceptance [42,43,44] and human–robot interaction [45,46].
The MDMT is a novel trust measure that differentiates between trust in the performance of a technology and trust in the moral integrity of a technology. In this validated questionnaire, participants rate a robotic entity using trust-related words like “genuine”, “skilled”, and “respectable” on a 7-point Likert scale. Each of the trust dimensions is represented by an equal number of trust-related words. The MDMT is primarily in use in human–robot interaction, e.g., [47,48]. To conclude the study, we conducted a short, semistructured, audio-recorded interview to provide participants with the option to vocalize any thoughts the questionnaires did not assess. The whole study took approximately 40 min to complete.

Participants

We recruited 14 participants, aged 18–31, M = 24.8, SD = 3.6 (3 female, 11 male). All participants had a technological background, with 7 studying engineering and 7 studying computer science. A total of 50.0% of participants were European, 35.7% were Asian, and 14.3% were North African or Middle Eastern. The participation was voluntary and participants were not compensated.

4. Analysis and Results

We present the analysis of the quantitative data collected in the online survey and during the physical interaction, focusing on the participants’ perception of the artificial hands. First, we analyze the emotional response to images of artificial hands from the online survey. We then present the quantitative data on user experience and trust in the artificial hands during the physical interaction. We conclude our results with a qualitative analysis of the feedback collected during the supplementary interviews.

4.1. Analysis of Emotional Response

We compared the emotional response to the different design features in terms of valence, arousal, and dominance using the SAM scale (see Figure 3). We analyzed the emotional responses using a one-way ANOVA and found that emotional valence differs significantly between the designs (p < 0.05). While the iLimb-Ultra had an average valence rating of 5.93 and the qb SoftHand of 6.24, the VariPlus Speed had an average valence rating of 5.17. We ran a post hoc Tukey test to determine which designs differed significantly in valence and found that the VariPlus Speed and the qb SoftHand induced significantly different emotional responses (p < 0.05). No significant difference was found in terms of arousal and dominance across the artificial hands (p > 0.05).
Moreover, we calculated the Pearson correlation coefficient to find correlations between the emotional responses and demographic data. Age had a positive correlation with ratings of arousal for the iLimb-Ultra (0.38) and the qb SoftHand (0.29); and the ratings of dominance for the qb SoftHand (0.36) and the VariPlus Speed (0.28). This means that, for the listed designs, older participants experienced more arousal and a more positive valence. Gender and technological background had no noteworthy impact on the perception of artificial hands. Prior experience with artificial hands had a positive correlation with the valence of the qb SoftHand (0.36).
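For readers who want to reproduce this type of analysis, the following Python sketch shows how a one-way ANOVA, a post hoc Tukey test, and Pearson correlations can be computed with SciPy and statsmodels. The ratings are randomly generated placeholders and the variable names are assumptions; the authors' actual data and analysis scripts are not published.

```python
# Sketch of the Section 4.1 analysis (one-way ANOVA, post hoc Tukey test, and
# Pearson correlations), assuming the SAM ratings are stored as one array per
# design. All values below are randomly generated placeholders, not study data.
import numpy as np
from scipy.stats import f_oneway, pearsonr
from statsmodels.stats.multicomp import pairwise_tukeyhsd

rng = np.random.default_rng(0)
valence = {  # 9-point SAM valence ratings from the 42 online participants
    "iLimb-Ultra":    rng.integers(4, 9, 42),
    "qb SoftHand":    rng.integers(4, 9, 42),
    "VariPlus Speed": rng.integers(3, 8, 42),
}

# One-way ANOVA across the three designs
f_stat, p_val = f_oneway(*valence.values())
print(f"ANOVA on valence: F = {f_stat:.2f}, p = {p_val:.3f}")

# Post hoc Tukey test to locate which pairs of designs differ
scores = np.concatenate(list(valence.values()))
groups = np.repeat(list(valence.keys()), [len(v) for v in valence.values()])
print(pairwise_tukeyhsd(scores, groups, alpha=0.05))

# Pearson correlation between a demographic variable and an emotional dimension
age = rng.integers(18, 49, 42)
arousal_qb = rng.integers(1, 10, 42)  # SAM arousal ratings for one design
r, p = pearsonr(age, arousal_qb)
print(f"age vs. arousal (qb SoftHand): r = {r:.2f}, p = {p:.3f}")
```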

4.2. Analysis of User Experience

We also evaluated the impact of the prosthetic design on user experience following the original handbook of the User Experience Questionnaire (UEQ) [41] (refer to http://www.ueq-online.org/ (accessed on 18 April 2023) for detailed information on the questionnaire). We used the six subscales of the UEQ as dependent variables in the statistical analysis (see Figure 4). As the order of positive and negative items varied in the questionnaire, we transformed the data using the tools provided in the handbook. For each artificial hand, we compared the perceived usability depending on interaction depth, i.e., the type of the first physical interaction. Interaction depth had a significant impact on the perspicuity of the VariPlus Speed prosthetic hand (p < 0.05), but not on any experience aspect of the iLimb-Ultra and the qb SoftHand.
According to the Shapiro–Wilk test, the data were not normally distributed. We thus used the Kruskal–Wallis H-test to compare the three designs of artificial hands. The results show that prosthesis design has a significant impact (p < 0.05) on the user experience in terms of efficiency and novelty during the initial interaction, and stimulation and novelty during the in-depth interaction. When significance was detected, we followed up with a post hoc Tukey test for pairwise comparisons. The results show that there is no significant difference in user experience between the iLimb-Ultra and the qb SoftHand (p > 0.05) across usability subscales. The perceived efficiency during the first interaction with the VariPlus Speed was significantly lower than with the iLimb-Ultra (p < 0.05). For the VariPlus Speed, the novelty ratings during both interactions and the stimulation ratings during the second interaction differed significantly from both the qb SoftHand (p < 0.05) and the iLimb-Ultra (p < 0.05). This result is in line with the qualitative feedback, which we discuss in Section 4.4.
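A minimal sketch of this non-parametric comparison is given below, assuming the UEQ subscale scores have already been rescaled to the −3 to +3 range with the official UEQ tools; all values and names are illustrative placeholders, not study data.

```python
# Non-parametric comparison of UEQ subscale scores across the three designs,
# assuming the raw items were already transformed to the -3..+3 range with the
# official UEQ tools; the values below are placeholders, not the study data.
import numpy as np
from scipy.stats import kruskal

rng = np.random.default_rng(1)
efficiency_first = {  # mean efficiency score per participant (n = 14) and hand
    "iLimb-Ultra":    rng.uniform(0.5, 2.5, 14),
    "qb SoftHand":    rng.uniform(0.5, 2.5, 14),
    "VariPlus Speed": rng.uniform(-1.0, 1.5, 14),
}

h_stat, p_val = kruskal(*efficiency_first.values())
print(f"Kruskal-Wallis on efficiency (first interaction): H = {h_stat:.2f}, p = {p_val:.3f}")
# If p < 0.05, pairwise post hoc comparisons (a Tukey test in the paper)
# identify which specific designs differ.
```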

4.3. Analysis of Trust

The Multi-Dimensional Conception and Measure of Human–Robot Trust (MDMT) [30] was used to assess trust in the three different artificial hands involved. According to the multiple dimensions of MDMT, we differentiate between performance trust, which includes how reliable and capable the artificial hand seems, and moral trust, which includes how ethical and sincere the artificial hand is perceived to be. On average, trust was rated at 4.63 (SD = 1.42) on a 7-point Likert scale.
We calculated a one-way ANOVA using Python to determine whether trust differed depending on the design. There is a significant difference in trust in the perceived capability of the different artificial hands (p < 0.05) (see Figure 5). We ran a post hoc Tukey test and found that the VariPlus Speed differs significantly from both the iLimb-Ultra and the qb SoftHand regarding trust in its capabilities, in both interactions (p < 0.05).
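The sketch below illustrates how such a comparison of the MDMT capability ratings could be computed in Python, assuming each participant's 7-point item ratings are first averaged into a subscale score per hand; the item grouping and all numbers are illustrative assumptions rather than the study data.

```python
# Sketch of the trust comparison, assuming each participant's 7-point MDMT item
# ratings are first averaged into a "capable" subscale score per hand; the item
# grouping and all values here are illustrative assumptions, not the study data.
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(2)

def subscale_mean(item_ratings: np.ndarray) -> np.ndarray:
    """Average the items of one MDMT subscale for each participant (rows)."""
    return item_ratings.mean(axis=1)

# 14 participants x 4 hypothetical "capable" items, one matrix per hand
capable = {
    "iLimb-Ultra":    subscale_mean(rng.integers(4, 8, (14, 4))),
    "qb SoftHand":    subscale_mean(rng.integers(4, 8, (14, 4))),
    "VariPlus Speed": subscale_mean(rng.integers(2, 6, (14, 4))),
}

f_stat, p_val = f_oneway(*capable.values())
print(f"ANOVA on capability trust: F = {f_stat:.2f}, p = {p_val:.3f}")
# A post hoc Tukey test (as in the earlier sketch) would then show which
# designs differ pairwise in perceived capability.
```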
There was no significant difference in the other dimensions of trust between different designs (p > 0.05). Further, there was no significant difference in trust between the first interaction and the second interaction.

4.4. Analysis of Qualitative Feedback

We transcribed the post-experiment interviews and used open coding to cluster the participants’ responses. This resulted in the following themes regarding the perception of the different artificial hands: (1) Initial expectations of the capabilities of artificial hands are higher than their actual capabilities. (2) Aesthetics influence perception. (3) Interaction depth influences preferences. (4) Robotic hands do not have to mimic human hands.

4.4.1. Initial Expectations

Based on the initial interaction, participants had high expectations of the capabilities of artificial hands (see Section 4.3). Half of the participants confirmed and emphasized this in their qualitative statements regarding all devices. P3 said “I expected more of [the interactions]. It was a lot of simple motions. Just open and close”. Participants primarily focused on the dexterity of the artificial hands, e.g., “I expected more. For example, that the individual fingers would move.” (P11) and “I expected more degrees of freedom” (P7).
Compared to the other two artificial hands, the iLimb-Ultra has the most sophisticated design, with parts of the inner tendons exposed. This additional visual information helped participants to understand the capabilities and behavior of the artificial hand. P9 detailed “I could [perceive] what was happening. While this one, the qb Softhand, in this one I can’t tell what is happening. It feels a little random”. Similarly, P7 said “For the hook style, I couldn’t really see the inside [...] I couldn’t really expect what movements are possible”.

4.4.2. Aesthetics Influence Perception

The aesthetics and general look of the artificial hands had a big impact on the perception of and attitudes toward them. Thus, when given the opportunity to comment on their experience, most participants mentioned the appearance, e.g., “I think the design [of the iLimb-Ultra] is very interesting and very robotic, very futuristic” (P1).
Most participants further commented on the contact experience with the artificial hands in terms of texture and level of compliance, e.g., “I thought [the qb SoftHand] would be smoother” (P6). P8 further noted that the shape of the VariPlus Speed felt unnatural to them since the fingers were proportionally shorter than those of human hands. The joints of the qb SoftHand are more flexible, which was perceived well by participants. The SoftHand joints “felt more natural” (P4) and were “relatively soft” (P2). P3 commented that “it was better that you could bend the fingers. So it was not as stiff as the other ones”.
Most participants commented on the naturalness of prosthetic hands, as natural interactions are desired. P1, for example, thought that covering the qb SoftHand with a material akin to a gardening glove was appealing: “When I look at it, I don’t see anything mechanical or something unnatural or something different than the initial hand with the glove”. The VariPlus Speed has the most human-like design and is thus “[the] most natural from optics” (P5). However, half of the participants mentioned feeling uncomfortable with this design, probably due to the uncanny valley effect, a phenomenon where artificial elements designed to be human-like are easily recognized as nonhuman and thus cause unease [49]. P6 and P14 described this phenomenon: “I think the [VariPlus Speed] is a bit creepy for me because it looks like a human hand, but on the other hand it doesn’t.” (P6); “This is something that is too close to humans. So it’s kind of a little bit terrible.” (P14). Preferences for the iLimb-Ultra and the qb SoftHand were split among participants, while nobody favored the VariPlus Speed. The primary reasons for favoring one artificial hand design over another were aesthetics (n = 9) and observed capabilities (n = 5).

4.4.3. Interaction Depth Influences Preferences

During the last interaction session, participants experienced the artificial hands in movement. Feeling the hands and their movement impacted the participants’ opinions. Participants (P1, P4, P6, P11) commented on movement speed, e.g., “[The speed of the hands was] much faster than I expected” (P1). During the movement, the prosthetic hands are all rather loud, which P4, P9, and P11 perceived negatively. Several participants (P5, P11, P13) mentioned the strength of the artificial hand and noted that the stronger the artificial hand was, the less safe they felt during the interaction. P11 explicitly pointed out that safety was only a concern for the second interaction when the artificial hands were moving.

4.4.4. Robotic Hands Do Not Have to Mimic Humans

Finally, we asked participants how their perception would differ if the artificial hands were used as an end-effector for a robot. While half of the participants did not think their perception would be different, the other half detailed differences in design and capabilities. P3, for example, found that for simple actions the two scenarios would be similar, i.e., “It would depend on the degree to which the robot can control the hands. If it was just one sequence (open-close) then it would be the same”. As there would be no human in control of the hand to intervene when safety issues arise, the robot would need to be more capable and intelligent.
In terms of design, participants found the iLimb-Ultra more appropriate due to its robotic appearance, while the VariPlus Speed was perceived as worse. P12 said that “either you completely try to mimic a human or you don’t” in support of a more technical design for robotic end-effectors. Meanwhile “the hook-style on the attachment of a humanoid robot would look something closer to, let’s say, a horror movie” (P13).

5. Discussion

In this study, we aimed to investigate the influence of interaction depth and design features on the perception of prosthetic hands among external observers. Throughout the study, three distinct artificial hands were employed, each characterized by different design features encompassing various factors: (i) level of compliance, (ii) aesthetic appearance, and (iii) design complexity (i.e., polyarticulated or not). While the quantitative data solely allow the comparison among the hands included, qualitative feedback sheds light on design factors that impact perception. The results of our mixed-method study have implications for the future design of artificial hands, particularly in terms of establishing appropriate expectations from the perspective of external observers (bystanders).

5.1. Interaction Depth

In-depth interactions increase the understanding of what artificial hands are capable of. Movement has been used in HRI to increase the transparency of robots [50]. In line with this, participants in this study had a better understanding (i.e., perspicuity) of the functionality and capabilities of the VariPlus Speed after observing the system and its movement (see Figure 4b). While similar trends existed for other hands, the difference was not significant. Note that perspicuity is higher for simpler artificial hands (i.e., the VariPlus Speed) from the beginning. Based on the qualitative feedback, this seems to result from the simpler movements and capabilities, which are easier to comprehend. This means that the design of the VariPlus Speed itself already clearly communicated the actual capabilities, probably because these are limited compared to the other hands tested. According to this result, the capabilities of artificial hands should be reflected and clearly visible in the design architecture. We can further observe that novelty was lower during the last interaction for all artificial hand designs (see Figure 4f).
There was no significant difference in trust for different interaction depths. However, results from the MDMT show a trend that trust in some designs could differ according to the interaction depth. For instance, the perceived reliability subscale (see Figure 5a) shows that the added interaction depth did not change the trust in the iLimb-Ultra, but made a difference for the VariPlus Speed. Qualitative feedback suggests that the speed and the strength of the system movement can impact trust. An artificial hand that has the capability of causing physical harm through movement is approached with more caution. Thus, future research can explore providing feedback on the movement properties to the user and to external observers.

5.2. Design of Artificial Hands

The presented study compares two rigid artificial hands (VariPlus Speed and iLimb-Ultra) and a soft one (qb SoftHand). Based on Jørgensen et al. [23], we expected different usability and trust outcomes according to their level of compliance. However, the soft qb SoftHand and rigid iLimb-Ultra were perceived very similarly, while the rigid VariPlus Speed was singled out both in quantitative and qualitative results. Note that although the qb SoftHand and iLimb-Ultra present different rigidness properties, both of them are polyarticulated systems. While participants did favorably mention the soft joints of the qb SoftHand as a distinguishing feature, the immediate perception was primarily based on stylistic design choices. Thus, while the rigidness of artificial hands is a differentiating factor for prosthesis users [24], it does not necessarily affect the immediate perception of external observers.
In terms of emotional response, only the emotional valence is affected by the design of artificial hands (see Figure 3a). We found that the humanlike VariPlus Speed had a less positive valence, i.e., was perceived as less pleasant, while the other artificial hands were perceived similarly. The attractiveness subscale (UEQ) upon physical interaction yields similar results (see Figure 4a). These results could indicate that participants had lower expectations of the capabilities due to the nontechnical appearance of the artificial hand. However, based on the qualitative feedback, a more likely interpretation is that the participants had a negative reaction towards the VariPlus Speed hand due to the uncanny valley effect [49]. Although anthropomorphism can have positive effects in HRI, e.g., on perceived competence [51], designers should refrain from overly humanlike artificial hands to avoid adverse reactions.
In contrast, designers can make use of the technical nature of artificial hands and allow insights into mechanical processes. The mechanical design of the iLimb-Ultra allows for the direct observation of the underlying mechanical mechanisms behind the hand movements, which was received well by external observers. While this reflects the opinion of external observers, primary users may want to disguise the technical nature of their hands to avoid social stigma [20]. Participants of this study also noticed the motor noise, which could be reduced to provide a more pleasant experience.
According to our findings, the design of artificial hands impacts user experience in terms of efficiency, novelty, and stimulation (see Section 4.2). Further, trust in the functionality of artificial hands varies depending on their design, while trust in their moral integrity remains unaffected by design (see Section 4.3). This suggests that the design of artificial hands communicates more information related to their performance capabilities. While moral trust did not play a role in the scenario we selected, we cannot exclude that moral trust is impacted by artificial hands in other scenarios and contexts. Qualitative feedback suggests that external observers base their trust on performance metrics, as a human operates the artificial hand in place of a robot. Future work could explore the impact of the operator, i.e., human versus robotic agents, on moral trust in other applications. Our study revealed that the actual capabilities of artificial hands do not always align with the expected or perceived capabilities, which can have negative consequences for the user’s experience and safety. Kosch et al. [52] demonstrated that novel technologies can create a placebo effect in users, and Villa et al. [53] showed that this effect also applies to embodied devices used to assist with tasks. Additionally, research has shown that biased expectations of system functionalities can lead to higher-risk decisions, even when the system is not functional. Prostheses should thus be designed to preserve coherent expectations of their functionality. For instance, the VariPlus Speed may be less likely to induce biased judgments of capabilities and, consequently, be less likely to lead to risky user behavior.

5.3. Limitations

In this study, we focused on external observers, limiting our experiments to able-bodied participants. While the perception external observers have of prosthetic hands is interesting to optimize social acceptance, it is essential to remember that primary users hold the utmost priority. Future studies could revisit the emotional perception and trust primary users develop in artificial hands, contingent upon their design characteristics.
It is worth emphasizing that our participants were young and predominantly had a technical background, which affects the generalizability of our results to a broader population. Thus, participants’ understanding of the artificial hands was influenced by their technical expertise, especially as the artificial hands were regarded as objects in this study. We suspect that people with a less technical background would have more extreme opinions on prosthetic hands: either they could overestimate the capabilities due to limited awareness of technical limitations, or they could show higher distrust and aversion due to less familiarity. While this study provides initial insights into the social impact of artificial hand design, future studies are needed to explore differences in perception across the general population. A suitable next step from this study is a field study that explores the performance trust and perceived usability of artificial hands in use by actual prosthesis users.
Our findings show the need for meticulous design consideration in artificial hands, with particular emphasis on material selection and the level of human likeness. Nonetheless, the present analysis does not offer a comprehensive understanding of the relative significance of individual design aspects, as this was beyond the scope of our study. Moreover, our study only included three different models of artificial hands, albeit carefully chosen to encompass a broad spectrum of design trends. Future research could investigate additional designs and examine individual design aspects further by conducting detailed comparative investigations that control for confounding variables, e.g., by exclusively varying the material.

6. Conclusions

The presented study explores the perception of artificial hands based on design features and interaction depth in a mixed-method user study. This study aimed to investigate the initial emotional response to artificial hands as systems in a controlled environment. Our findings indicate that design has an effect on the emotional response, user experience, and trust in the capabilities of artificial hands. Perceived trust and emotional valence were negatively affected by the humanlike appearance of the artificial hands tested. Contrary to our expectations, participants showed higher levels of trust in artificial hands that present a more technological design. Interaction depth and additional information through mechanical design increase the perceived usability in terms of perspicuity, but other aspects of perception remain largely unaffected. Thus, it is imperative for designers to contemplate the congruence between the appearance of artificial hands and their functional capacities. From a more general perspective, improving bystander trust in artificial hands could lead to broader acceptance of prosthetic devices. However, the usability and trust perception of primary users will continue to take precedence.

Author Contributions

Conceptualization, S.Y.S., P.C.-M. and C.P.; methodology, S.Y.S., P.C.-M. and C.P.; formal analysis, S.Y.S.; investigation, S.Y.S.; resources, P.C.-M. and C.P.; data curation, S.Y.S.; writing—original draft preparation, S.Y.S.; writing—review and editing, S.Y.S., P.C.-M., S.V., C.P. and A.B.; visualization, S.Y.S.; supervision, P.C.-M., C.P. and A.B.; project administration, S.Y.S.; funding acquisition, C.P. and A.B. All authors have read and agreed to the published version of the manuscript.

Funding

This research project/publication was supported by LMUexcellent and TUM AGENDA 2030, funded by the Federal Ministry of Education and Research (BMBF) and the Free State of Bavaria under the Excellence Strategy of the Federal Government and the Länder as well as by the Hightech Agenda Bavaria.

Institutional Review Board Statement

The study was conducted in accordance with the Declaration of Helsinki, and approved by the Ethics Commission of the Technical University of Munich, reference number 478/21 S-SR (27 September 2021).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The data presented in this study are available on request from the corresponding author. The data are not publicly available due to privacy reasons. The audio recordings were deleted after transcription for privacy reasons.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Robaczewski, A.; Bouchard, J.; Bouchard, K.; Gaboury, S. Socially assistive robots: The specific case of the NAO. Int. J. Soc. Robot. 2021, 13, 795–831. [Google Scholar] [CrossRef]
  2. Kwon, M.; Huang, S.H.; Dragan, A.D. Expressing Robot Incapability. In Proceedings of the 2018 ACM/IEEE International Conference on Human-Robot Interaction (HRI ‘18), Chicago, IL, USA, 5–8 March 2018; Association for Computing Machinery: New York, NY, USA, 2018; pp. 87–95. [Google Scholar] [CrossRef]
  3. Kopp, T.; Baumgartner, M.; Kinkel, S. Success factors for introducing industrial human-robot interaction in practice: An empirically driven framework. Int. J. Adv. Manuf. Technol. 2021, 112, 685–704. [Google Scholar] [CrossRef]
  4. Goetz, J.; Kiesler, S.; Powers, A. Matching robot appearance and behavior to tasks to improve human-robot cooperation. In Proceedings of the 12th IEEE International Workshop on Robot and Human Interactive Communication (ROMAN 2003), Millbrae, CA, USA, 2 November 2003; pp. 55–60. [Google Scholar]
  5. Ososky, S.; Philips, E.; Schuster, D.; Jentsch, F. A Picture is Worth a Thousand Mental Models: Evaluating Human Understanding of Robot Teammates. Proc. Hum. Factors Ergon. Soc. Annu. Meet. 2013, 57, 1298–1302. [Google Scholar] [CrossRef]
  6. Sanders, T.L.; MacArthur, K.; Volante, W.; Hancock, G.; MacGillivray, T.; Shugars, W.; Hancock, P. Trust and prior experience in human-robot interaction. Proc. Hum. Factors Ergon. Soc. Annu. Meet. 2017, 61, 1809–1813. [Google Scholar] [CrossRef]
  7. Belanche, D.; Casaló, L.V.; Schepers, J.; Flavián, C. Examining the effects of robots’ physical appearance, warmth, and competence in frontline services: The Humanness-Value-Loyalty model. Psychol. Mark. 2021, 38, 2357–2376. [Google Scholar] [CrossRef]
  8. Tay, B.; Jung, Y.; Park, T. When stereotypes meet robots: The double-edge sword of robot gender and personality in human–robot interaction. Comput. Hum. Behav. 2014, 38, 75–84. [Google Scholar] [CrossRef]
  9. Walters, M.L.; Koay, K.L.; Syrdal, D.S.; Dautenhahn, K.; Te Boekhorst, R. Preferences and perceptions of robot appearance and embodiment in human-robot interaction trials. In Proceedings of the New Frontiers in Human-Robot Interaction, Edinburgh, UK, 8–9 April 2009. [Google Scholar]
  10. Paetzel, M.; Perugia, G.; Castellano, G. The persistence of first impressions: The effect of repeated interactions on the perception of a social robot. In Proceedings of the 2020 ACM/IEEE International Conference on Human-Robot Interaction, Cambridge, UK, 23–26 March 2020; pp. 73–82. [Google Scholar]
  11. Wainer, J.; Feil-seifer, D.J.; Shell, D.A.; Mataric, M.J. The role of physical embodiment in human-robot interaction. In Proceedings of the ROMAN 2006—The 15th IEEE International Symposium on Robot and Human Interactive Communication, Hatfield, UK, 6–8 September 2006; pp. 117–122. [Google Scholar] [CrossRef]
  12. Townsend, D.; MajidiRad, A. Trust in Human-Robot Interaction within Healthcare Services: A Review Study. In Proceedings of the International Design Engineering Technical Conferences and Computers and Information in Engineering Conference, St. Louis, MO, USA, 14–17 August 2022; American Society of Mechanical Engineers: New York, NY, USA, 2022; Volume 7, p. V007T07A030. [Google Scholar]
  13. Xu, J.; Bryant, D.G.; Howard, A. Would You Trust a Robot Therapist? Validating the Equivalency of Trust in Human-Robot Healthcare Scenarios. In Proceedings of the 2018 27th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN), Nanjing, China, 27–31 August 2018; pp. 442–447. [Google Scholar] [CrossRef]
  14. Biddiss, E.A.; Chau, T.T. Upper limb prosthesis use and abandonment: A survey of the last 25 years. Prosthet. Orthot. Int. 2007, 31, 236–257. [Google Scholar] [CrossRef]
  15. Salminger, S.; Stino, H.; Pichler, L.H.; Gstoettner, C.; Sturma, A.; Mayer, J.A.; Szivak, M.; Aszmann, O.C. Current rates of prosthetic usage in upper-limb amputees—Have innovations had an impact on device acceptance? Disabil. Rehabil. 2022, 44, 3708–3713. [Google Scholar] [CrossRef]
  16. Kwon, M.; Jung, M.F.; Knepper, R.A. Human expectations of social robots. In Proceedings of the 2016 11th ACM/IEEE International Conference on Human-Robot Interaction (HRI), Christchurch, New Zealand, 7–10 March 2016; pp. 463–464. [Google Scholar] [CrossRef]
  17. Schramm, L.T.; Dufault, D.; Young, J.E. Warning: This Robot is Not What It Seems! Exploring Expectation Discrepancy Resulting from Robot Design. In Proceedings of the Companion of the 2020 ACM/IEEE International Conference on Human-Robot Interaction (HRI ‘20), Cambridge, UK, 23–26 March 2020; Association for Computing Machinery: New York, NY, USA, 2020; pp. 439–441. [Google Scholar] [CrossRef]
  18. Mohebbi, A. Human-robot interaction in rehabilitation and assistance: A review. Curr. Robot. Rep. 2020, 1, 131–144. [Google Scholar] [CrossRef]
  19. Piazza, C.; Grioli, G.; Catalano, M.; Bicchi, A. A century of robotic hands. Annu. Rev. Control Robot. Auton. Syst. 2019, 2, 1–32. [Google Scholar] [CrossRef]
  20. Arabian, A.; Varotsis, D.; McDonnell, C.; Meeks, E. Global social acceptance of prosthetic devices. In Proceedings of the 2016 IEEE Global Humanitarian Technology Conference (GHTC), Seattle, WA, USA, 13–16 October 2016; pp. 563–568. [Google Scholar] [CrossRef]
  21. Meyer, B.; Asbrock, F. Disabled or Cyborg? How Bionics Affect Stereotypes Toward People with Physical Disabilities. Front. Psychol. 2018, 9, 2251. [Google Scholar] [CrossRef]
  22. Capsi-Morales, P.; Piazza, C.; Catalano, M.G.; Grioli, G.; Schiavon, L.; Fiaschi, E.; Bicchi, A. Comparison between rigid and soft poly-articulated prosthetic hands in non-expert myo-electric users shows advantages of soft robotics. Sci. Rep. 2021, 11, 23952. [Google Scholar] [CrossRef]
  23. Jørgensen, J.; Bojesen, K.B.; Jochum, E. Is a soft robot more “natural”? Exploring the perception of soft robotics in human–robot interaction. Int. J. Soc. Robot. 2022, 14, 95–113. [Google Scholar] [CrossRef]
  24. Capsi-Morales, P.; Piazza, C.; Catalano, M.G.; Bicchi, A.; Grioli, G. Exploring Stiffness Modulation in Prosthetic Hands and Its Perceived Function in Manipulation and Social Interaction. Front. Neurorobot. 2020, 14, 33. [Google Scholar] [CrossRef]
  25. Hsu, C.; Tsao, C.C.; Weng, Y.L.; Tang, C.Y.; Chang, Y.W.; Kang, Y.; Chien, S.Y. A Machine Learning Approach to Model HRI Research Trends in 2010~2021. In Proceedings of the 2022 ACM/IEEE International Conference on Human-Robot Interaction (HRI ‘22), Sapporo, Japan, 7–10 March 2022; pp. 812–815. [Google Scholar] [CrossRef]
  26. Theodorou, A.; Wortham, R.H.; Bryson, J.J. Designing and implementing transparency for real time inspection of autonomous robots. Connect. Sci. 2017, 29, 230–241. [Google Scholar] [CrossRef]
  27. Hancock, P.A.; Billings, D.R.; Schaefer, K.E.; Chen, J.Y.C.; de Visser, E.J.; Parasuraman, R. A Meta-Analysis of Factors Affecting Trust in Human-Robot Interaction. Hum. Factors 2011, 53, 517–527. [Google Scholar] [CrossRef]
  28. Schaefer, K.E. Measuring trust in human robot interactions: Development of the “trust perception scale-HRI”. In Robust Intelligence and Trust in Autonomous Systems; Springer: New York, NY, USA, 2016; pp. 191–218. [Google Scholar] [CrossRef]
  29. Lee, J.D. Review of a pivotal Human Factors article: “Humans and automation: Use, misuse, disuse, abuse”. Hum. Factors 2008, 50, 404–410. [Google Scholar] [CrossRef]
  30. Malle, B.F.; Ullman, D. A multidimensional conception and measure of human-robot trust. In Trust in Human-Robot Interaction; Elsevier: Amsterdam, The Netherlands, 2021; pp. 3–25. [Google Scholar]
  31. Nesset, B.; Rajendran, G.; Aguas Lopes, J.D.; Hastie, H. Sensitivity of Trust Scales in the Face of Errors. In Proceedings of the 2022 ACM/IEEE International Conference on Human-Robot Interaction (HRI ‘22), Sapporo, Japan, 7–10 March 2022; pp. 950–954. [Google Scholar]
  32. Salem, M.; Lakatos, G.; Amirabdollahian, F.; Dautenhahn, K. Would You Trust a (Faulty) Robot?: Effects of Error, Task Type and Personality on Human-Robot Cooperation and Trust. In Proceedings of the Tenth Annual ACM/IEEE International Conference on Human-Robot Interaction, Portland, OR, USA, 2–5 March 2015; pp. 141–148. [Google Scholar] [CrossRef]
  33. Hastie, H.; Liu, X.; Patron, P. Trust Triggers for Multimodal Command and Control Interfaces. In Proceedings of the 19th ACM International Conference on Multimodal Interaction (ICMI ‘17), Glasgow, UK, 13–17 November 2017; Association for Computing Machinery: New York, NY, USA, 2017; pp. 261–268. [Google Scholar] [CrossRef]
  34. Abd, M.A.; Gonzalez, I.; Ades, C.; Nojoumian, M.; Engeberg, E.D. Simulated robotic device malfunctions resembling malicious cyberattacks impact human perception of trust, satisfaction, and frustration. Int. J. Adv. Robot. Syst. 2019, 16. [Google Scholar] [CrossRef]
  35. Jabban, L.; Metcalfe, B.W.; Raines, J.; Zhang, D.; Ainsworth, B. Experience of Adults with Upper-limb Difference and their Views on Sensory Feedback for Prostheses: A Mixed Methods Study. medRxiv 2022. [Google Scholar] [CrossRef]
  36. Ottobock. Available online: https://www.ottobock.com/ (accessed on 10 May 2023).
  37. Ossur. Available online: https://www.ossur.com/ (accessed on 10 May 2023).
  38. Capsi-Morales, P.; Piazza, C.; Grioli, G.; Bicchi, A.; Catalano, M.G. The SoftHand Pro platform: A flexible prosthesis with a user-centered approach. J. NeuroEng. Rehabil. 2023, 20, 20. [Google Scholar] [CrossRef]
  39. Bradley, M.M.; Lang, P.J. Measuring emotion: The self-assessment manikin and the semantic differential. J. Behav. Ther. Exp. Psychiatry 1994, 25, 49–59. [Google Scholar] [CrossRef] [PubMed]
  40. Australian Bureau of Statistics. Australian Standard Classification of Cultural and Ethnic Groups (ASCCEG). ABS Website. 2019. Available online: https://www.abs.gov.au/statistics/classifications/australian-standard-classification-cultural-and-ethnic-groups-ascceg/latest-release (accessed on 2 December 2022).
  41. Laugwitz, B.; Held, T.; Schrepp, M. Construction and evaluation of a user experience questionnaire. In HCI and Usability for Education and Work, Proceedings of the 4th Symposium of the Workgroup Human-Computer Interaction and Usability Engineering of the Austrian Computer Society, USAB 2008, Graz, Austria, 20–21 November 2008; Springer: Berlin/Heidelberg, Germany, 2008; pp. 63–76. [Google Scholar]
  42. Mlekus, L.; Bentler, D.; Paruzel, A.; Kato-Beiderwieden, A.L.; Maier, G.W. How to raise technology acceptance: User experience characteristics as technology-inherent determinants. Gr. Interakt. Organ. Z. Angew. Organisationspsychol. 2020, 51, 273–283. [Google Scholar] [CrossRef]
  43. Fiore, D.; Baldauf, M.; Thiel, C. “Forgot Your Password Again?”: Acceptance and User Experience of a Chatbot for in-Company IT Support. In Proceedings of the 18th International Conference on Mobile and Ubiquitous Multimedia (MUM ‘19), Pisa, Italy, 26–29 November 2019; Association for Computing Machinery: New York, NY, USA, 2019. [Google Scholar] [CrossRef]
  44. Manghisi, V.M.; Evangelista, A.; Semisa, D.; Latorre, V.; Uva, A.E. Evaluating the Acceptance of Cinematic virtual reality-based applications for rehabilitative interventions in Schizophrenia. Games Health J. 2022, 11, 385–392. [Google Scholar] [CrossRef] [PubMed]
  45. Chan, W.P.; Hanks, G.; Sakr, M.; Zhang, H.; Zuo, T.; Van der Loos, H.M.; Croft, E. Design and evaluation of an augmented reality head-mounted display interface for human robot teams collaborating in physically shared manufacturing tasks. ACM Trans. Hum.-Robot. Interact. 2022, 11, 1–19. [Google Scholar] [CrossRef]
  46. Neef, C.; Linden, K.; Richert, A. Exploring the Influencing Factors on User Experience in Robot-Assisted Health Monitoring Systems Combining Subjective and Objective Health Data. Appl. Sci. 2023, 13, 3537. [Google Scholar] [CrossRef]
  47. Banks, J. Good robots, bad robots: Morally valenced behavior effects on perceived mind, morality, and trust. Int. J. Soc. Robot. 2021, 13, 2021–2038. [Google Scholar] [CrossRef]
  48. Martinez, J.E.; VanLeeuwen, D.; Stringam, B.B.; Fraune, M.R. Hey?! What did you think about that Robot? Groups Polarize Users’ Acceptance and Trust of Food Delivery Robots. In Proceedings of the 2023 ACM/IEEE International Conference on Human-Robot Interaction, Stockholm, Sweden, 13–16 March 2023; pp. 417–427. [Google Scholar]
  49. Mori, M.; MacDorman, K.F.; Kageki, N. The Uncanny Valley [From the Field]. IEEE Robot. Autom. Mag. 2012, 19, 98–100. [Google Scholar] [CrossRef]
  50. Wallkötter, S.; Tulli, S.; Castellano, G.; Paiva, A.; Chetouani, M. Explainable Embodied Agents Through Social Cues: A Review. J. Hum.-Robot Interact. 2021, 10, 1–24. [Google Scholar] [CrossRef]
  51. Christoforakos, L.; Gallucci, A.; Surmava-Große, T.; Ullrich, D.; Diefenbach, S. Can Robots Earn Our Trust the Same Way Humans Do? A Systematic Exploration of Competence, Warmth, and Anthropomorphism as Determinants of Trust Development in HRI. Front. Robot. AI 2021, 8, 640444. [Google Scholar] [CrossRef]
  52. Kosch, T.; Welsch, R.; Chuang, L.; Schmidt, A. The Placebo Effect of Artificial Intelligence in Human-Computer Interaction. ACM Trans. Comput.-Hum. Interact. 2022, 29, 1–32. [Google Scholar] [CrossRef]
  53. Villa, S.; Kosch, T.; Grelka, F.; Schmidt, A.; Welsch, R. The Placebo Effect of Human Augmentation: Anticipating Cognitive Augmentation Increases Risk-Taking Behavior. Comput. Hum. Behav. 2023, 146, 107787. [Google Scholar] [CrossRef]
Figure 1. Overview of the protocol and assessments included in this study. Note that the same procedure is repeated for the three devices in a randomized order. SAM = self-assessment manikin; UEQ = User Experience Questionnaire; MDMT = Multi-Dimensional Conception and Measure of Human–Robot Trust.
Figure 2. The different types of artificial hands used in the study: the qb SoftHand (QBRobotics), the iLimb-Ultra (Ossur), and the VariPlus Speed (Ottobock).
Figure 3. Results of the SAM scale for the online survey. The emotional response to static images of the three artificial hand designs iLimb-Ultra (red), qb SoftHand (blue), and VariPlus Speed (green) is evaluated on a 9-point Likert scale. The asterisk indicates a significant difference between pairs.
Figure 4. Results of the six dimensions of user experience measured with the UEQ. Raw data have been transformed to a range from −3 (negative user experience) to +3 (positive user experience). Each figure depicts the positive and negative attitudes toward the three artificial hands: iLimb-Ultra in red, the qb SoftHand in blue, and the VariPlus Speed in green. The first interaction is shown on the left and the second interaction on the right. The asterisk indicates a significant difference between pairs.
Figure 5. Results of the four dimensions of trust measured with the MDMT on a 7-point Likert scale for the three artificial hands: iLimb-Ultra (red), the qb SoftHand (blue), and the VariPlus Speed (green). Each figure displays the first interaction with an artificial hand on the left (shaded) and the second interaction on the right (plain). The asterisk indicates a significant difference between pairs.
Table 1. Design factors characterizing each device included in this study.

Device | Level of Compliance | Aesthetic Appearance | Design Complexity
VariPlus Speed | Rigid | Humanlike | Hook-like
iLimb-Ultra | Rigid | Machinelike | Polyarticulated
qb SoftHand | Soft | Machinelike | Polyarticulated