Article

Conveying Emotions by Touch to the Nao Robot: A User Experience Perspective

1 School of Informatics, University of Skövde, Box 408, 541 28 Skövde, Sweden
2 Department of Information Technology, Uppsala University, Box 256, 751 05 Uppsala, Sweden
3 Department of Applied IT, University of Gothenburg, Box 100, 405 30 Gothenburg, Sweden
* Author to whom correspondence should be addressed.
Multimodal Technol. Interact. 2018, 2(4), 82; https://doi.org/10.3390/mti2040082
Submission received: 6 November 2018 / Revised: 8 December 2018 / Accepted: 12 December 2018 / Published: 16 December 2018

Abstract

Social robots are expected to be used by more and more people in a widening range of settings, domestic as well as professional. As a consequence, the demands on the features and quality of human–robot interaction will increase, including the ability to communicate emotions and to establish a positive user experience, e.g., using touch. In this paper, the focus is on depicting how humans, as the users of robots, experience tactile emotional communication with the Nao robot, as well as on identifying aspects that affect the experience and the touch behavior. A qualitative investigation was conducted as part of a larger experiment. The major findings consist of 15 aspects, each varying along one or more dimensions, and a description of how these influence the four dimensions of user experience identified in the study as well as the different parts of the touch behavior used to convey emotions.

1. Introduction

Robots have been entering the world of humans, in vocational as well as domestic settings, for quite a while, and social robots are anticipated to have an increasing significance in everyday life for a growing number of people [1,2]. It is therefore important to study how people want to interact with robots and what constitutes an intuitive, smooth, and natural interaction between humans and robots. For social robots, as for all other interactive systems, products, and devices, a positive user experience (UX) is necessary in order to achieve the intended benefits. The UX is not built into the product itself; instead, it is an outcome of the interaction that depends on the internal state of the user, the quality and attributes of the product, and the particular situation [3,4]. Hence, UX is difficult to define but easy to identify [5,6]. The UX of social robots needs to be a central issue of concern, because positive UX underpins the proliferation of social robots in society [7]. Conversely, negative UX can result in reluctance to interact with robots and challenge the acceptance of future robotic technologies [8]. However, a positive UX does not appear by itself. It has to be systematically, thoroughly, and consciously designed for, not least in the interactions between humans and robots [1,3,5,9,10].
Social robots, designed to interact with human beings, need to act in relation to the social and emotional aspects of human life and be able to sense and react to social cues. As interaction between humans and robots becomes more complex, there is an increased interest in developing robots with human-like features and qualities that enable interaction with humans in more intuitive, smooth, and meaningful ways [11,12]. Consequently, touch, as one of the most fundamental aspects of human social interaction [13], has started to receive interest in the field of human–robot interaction (HRI) [14,15]. It has been argued that enabling robots to “feel”, “understand”, and respond to touch in accordance with human expectations would enable a more intuitive and smooth interaction between humans and robots [15]. The fundamental role of touch in human bonding and social interaction makes it likely that humans will seek to show affection by touching robots, particularly robots meant to engage socially with humans. Hertenstein et al. [16] reported results from a study of affective tactile communication between humans, demonstrating that tactile communication carries information content similar to that of facial and vocal communication. We therefore see a growing need to consider touch as a natural part of the interaction between humans and social robots. However, in HRI, touch, and especially affective touch that communicates or evokes emotion, has received less attention than modalities such as vision and audio [15]. This stands in contrast to prior research showing that humans want to interact with robots through touch [17,18,19].
In order to support the design for a positive UX in interactions with social robots, we analyze how humans experience tactile interaction and the conveyance of emotions to the robot. This is important in order to enable a natural interaction that the robot can interpret and respond to in an appropriate manner. From an interaction design perspective, robots are unique in the sense that they constitute digital artifacts that not only provide input and output devices allowing users to access digital content, but can also directly sense and act upon the physical and social environment they share with their users. Thus, we believe that studies on tactile HRI are critical, as they target a modality that has largely been absent in the study and design of other digital devices.
This paper is based on an experiment on affective tactile HRI, more precisely on how humans convey emotions via touch to a socially interactive humanoid robot, in this case the robot Nao [20]. As a foundation for the study, we have, as far as possible, replicated a human–human interaction (HHI) experiment conducted by Hertenstein et al. [16] so as to validate our work within the context of natural tactile interaction. Our previous findings, reported in Andreasson et al. [21], show a strong similarity in how participants communicate emotions using touch, regardless of whether the receiver is another human or a humanoid robot. As a whole, those results support a strong transfer of theories from human–human tactile interaction to the HRI domain, although there are also notable differences compared to the results obtained by Hertenstein et al. [16].
The purpose of the current part of the study is to shed light on how humans, as the users of robots, experience this specific kind of interaction, but also to deepen the understanding of which aspects influence their experience as well as the ways they convey emotions by touch. Therefore, the subjective perspective of the humans was examined in relation to the tactile interaction with the robot. This means that we not only try to understand how the interaction unfolds and identify its patterns, but also find out how users experience it and what underlies both the interaction and the experience.
The remainder of the paper is organized as follows. Section 2 introduces the concepts of HRI, socially interactive robots, user experience, human touch, and motivates their interrelatedness. Section 3 describes the methodology of the experiment. In Section 4, the analysis and results are reported. Section 5 provides a discussion of the research results and Section 6 outlines future work and ends the paper with some concluding remarks.

2. Background

This section gives an overview of related work and central areas of the paper. These are human–robot interaction [1], socially interactive robots [1], user experience [1], human touch, and the role and relevance of touch in social HRI [22].

2.1. Human–Robot Interaction

Robots are increasingly becoming a part of the human world. In some domains, not least in industrial settings, they have been an important and natural presence for many years. They are also entering other settings, professional as well as domestic [2]. The purpose of robotic technology is to make it possible for a person to do something that they could not do earlier, to facilitate a certain task, to make it more pleasant, or to provide entertainment [23]. Robots can bring different kinds of value [24], for example, by doing monotonous assembly tasks in manufacturing or cutting the lawn. In those cases, humans often do not need to interact continuously with the robot. Other types of robots and usage situations, e.g., assisting elderly people, demand more frequent and multi-faceted interaction. This interplay between robots and their users has to be carefully taken into account when developing a robot in order for it to be valuable [1].
The problem of understanding and designing the interactions between humans and robots is the core interest of the field of HRI [1,23]. More precisely, “HRI is the science of studying people’s behavior and attitudes towards robots in relationship to the physical, technological and interactive features of the robots, with the goal to develop robots that facilitate the emergence of human–robot interactions that are at the same time efficient (according to the original requirements of their envisaged area of use), but are also acceptable to people, and meet the social and emotional needs of their individual users as well as respecting human values” [25].
The importance of and attention attracted to HRI are increasing concurrently with the growing number of technological achievements in robotics. For the same reasons, the concept of a robot is constantly changing [25]. Robots can vary along multiple dimensions, e.g., the types of task they are intended to support, their morphology, interaction roles, human–robot physical proximity, and autonomy level [26]. Roughly speaking, robots can be categorized into industrial robots, professional service robots, and personal service robots [27]. Moreover, the role of humans in relation to robots can vary; the human can be a supervisor, operator, mechanic, teammate, bystander, mentor, or information consumer [23,28]. Likewise, robots can have a wide range of manifestations and be used in different application areas. There are human-like robots (humanoids and androids), robots that look like animals, and mechanical-appearing robots. Robots can be used for urban search and rescue tasks, e.g., natural disasters and wilderness search; assistive and educational robotics, e.g., therapy for the elderly; military and police, e.g., patrol support; edutainment, e.g., museum tour guides; space, e.g., astronaut assistants; the home, e.g., robotic companions; and industry, e.g., construction [1,23,25].
Consequently, the interaction between user and robot can take a wide variety of forms, depending on user, task, and context-based conditions. Generally, interaction can be either remote, i.e., the human and the robot are spatially, and perhaps also temporally, separated, or proximate, i.e., the human and the robot are collocated [23]. The interaction can be indirect, which means that the user operates the robot by commanding it, or direct, which means that the communication between user and robot is bi-directional and that the robot can act on its own [27]. Dautenhahn [11] argues that in many application areas robots also need to have social skills, making them socially interactive robots [1].

2.2. Socially Interactive Robots

Socially interactive robots, or social robots for short, are “robots for which social interaction plays a key role” ([29], p. 145). Such robots should display social intelligence, meaning they demonstrate qualities that resemble human social expressions. Examples of such qualities are emotional appearance and perception, advanced dialogue capabilities, possibilities to recognize humans and other robots, as well as being able to make use of, for instance, gaze and gesture as part of communication [1,29].
It is not necessary for all robots to be highly socially interactive. Instead, the robot's purpose and the context it is supposed to act in or upon set the requirements for the extent of its social skills. For instance, robots that are completely autonomous or remotely controlled and work in isolation from human users may require few or even no social skills. A typical example is the industrial robot that, for safety reasons, only interacts with operators when it is turned off. As more applications involve runtime interactions between human and robot, social skills become increasingly important. One example is collaborative robots in production or assembly, but there are also examples from agriculture and firefighting. Even higher demands on social skills are placed on robots used as tour guides, hotel assistants, or entertainers. Social skills are also essential for robots used in nursing and therapy, while a domestic robot companion must have vast social intelligence [11]. Dautenhahn [11] puts forth that the social skills required of robots vary along several dimensions, and, hence, it is important to be aware of the role and context the robot is to be used in. The dimensions listed by Dautenhahn [11] are contact with humans (from none and remote to repeated long-term physical contact), robot functionality (from limited and clearly defined to open and adaptive functionality that is shaped by learning), role of robot (from machine or tool to roles such as assistant, companion, and partner), and requirement of social skills (from not required or desirable to essential) [1].
HRI research concerning social interactions with robots can be categorized into three approaches [11]: robot-centered, human-centered, and robot cognition-centered HRI. Robot-centered HRI focuses on sociable robots, where the robot is viewed as an autonomous creature and the human is some sort of caretaker who should identify and respond to its needs. Human-centered HRI, on the other hand, emphasizes the human, who should find the robot acceptable, pleasant, and able to fulfil its specified tasks. The core of robot cognition-centered HRI is to model and view robots as intelligent systems. In order to get robots to “inhabit our living environments”, there needs to be a synthesis of these three approaches ([11], p. 684). However, human-centered HRI has previously not received as much attention as the other two approaches [6]. Thus, beyond solving the technical challenges of each robot application, the robot needs to be carefully designed for a positive user experience, especially when it comes to socially interactive robots [1].

2.3. User Experience

User experience (UX) is a concept that is becoming increasingly important. Technology is spreading into almost every aspect of human daily life. Furthermore, humans have been using advanced technology for quite a while and, therefore, their expectations of and demands on the quality of technological products go beyond utility, usability, and acceptance. That a product is suitable for its purpose, is easy to use, and fits into its intended context is considered basic from the user's point of view; users also want a positive experience. UX is about the feelings that arise and form internally in a human through the use of technology in a particular context of use [3,9,30]. It can be defined as “the totality of the effect or effects felt by a user as a result of interaction with, and the usage context of, a system, device, or product, including the influence of usability, usefulness, and emotional impact during interaction and savouring memory after interaction” ([3], p. 5). This means that it is not possible to guarantee a certain UX, since it is the subjective inner state of a human. However, by designing a high-quality interaction with the intended users and usage context in mind, it is possible to influence the experience [1].
The concept of UX embraces pragmatic as well as hedonic quality [4]. Pragmatic quality is related to fulfilling the do-goals of the user, which means that the interactive product makes it possible for the user to reach task-related goals in an effective, efficient, and secure way. In other words, pragmatic quality is concerned with the usability and usefulness of the product. Hedonic quality, on the other hand, is about the be-goals of the user. Humans have psychological and emotional needs, which should be addressed by the interactive product. The user can, for instance, find the product cool, awesome, beautiful, trustworthy, satisfying, or fun. The product can, for example, evoke feelings of autonomy, competence, and relatedness to others [3,4,31]. The UX perspective includes not only functional aspects, but also experiential and emotional issues. It focuses on the positive, beyond the mere absence of problems. Additionally, a main objective of the field should be to contribute to the quality of life of humans [1,10].
As with all other interactive products for human use, interaction with and perception of socially interactive robots evoke feelings of different natures and intensities. A user can feel motivated to walk up and use a robot. They can experience a weak distrust of the robot and at the same time be curious about it. A user can find a robot to be well-adapted and highly useful after long-term use, although initially having experienced it as a bit strange and tricky. A robot can be found really fun and entertaining by young children, but boring by teenagers. Thus, UX has many facets and is complex. Therefore, it is important to identify and characterize what kinds of feelings are especially important for a particular socially interactive robot to arouse. Then it is possible to consciously design the robot with those feelings as the target and to evaluate whether the robot can be expected to awaken the intended experience in the user [1].

2.4. Human Touch

Although often overlooked, the sense of touch is an important communication channel with a fundamental role for human bonding and social interactions (e.g., [32,33]). In fact, physical contact can sometimes be more powerful than language, conveying strong vitality and immediacy [34]. An ethnographic study has shown that touch has powerful benefits, promoting trust in humans, which is a central component of long-term cooperative bonds [35]. Most of us are familiar with and recognize the encouragement in a pat on the back from a person of trust, just as we are familiar with the feeling of anxiety when being unexpectedly touched by a stranger on the bus. Undoubtedly, even brief touches can elicit strong emotional experiences [36].
An adult human body has approximately 18,000 square centimeters of skin [13], making the skin the largest of the human sense organs [36]. Schanberg (1995, in [37], p. 68) states that “… touch is not only basic to our species, but the key to it.” In fact, human beings already respond to touch as a fetus, receiving tactile stimulation through the mother's abdominal wall (e.g., [38]). Once born, touch is the main interaction channel between caregiver and child, and this caregiving touch has been shown to be essential for the child's growth and development (e.g., [39]). Touch is also critical for building emotional bonds, which in turn relate to emotional and intellectual development [37]. When these physical and emotional bonds are neglected, cognitive and neural development is negatively affected (e.g., [40,41]).
Touching brings positive physiological and biochemical effects, such as decreased levels of the stress hormone cortisol, increased levels of oxytocin, which is linked to increased well-being and anti-stress effects, and decreased heart rate (e.g., [42,43]). Touch has also been shown to affect behavior and attitude; e.g., a brief touch on the arm has been shown to positively affect the receiver's attitude towards the person they have interacted with (e.g., [44,45,46]). For example, students who had been briefly touched on the wrist displayed not only improved attitudes toward the instructor and the lecture, but also an increased motivation to study [47].
Touch plays an important role in interpersonal interaction and is, more than the other senses, universally recognized across cultures [32]. However, differences in touching behavior are complex and not very well understood. Many years of research have shown differences in the amount of touching in interpersonal interactions, differences in touch initiation, where on the human body the touching occurred, etc. The findings are not consistent, and the differences have been attributed to context (e.g., [48]), culture (e.g., [13]), dominance and status (e.g., [49]), gender (e.g., [49,50]), age group (e.g., [51]), and relationship [52].
Despite the inconclusive research on touch behavior, it is clear that touch is a central part of human life [33] that plays an important role in human development, shaping and characterizing emotional, relational, and cognitive functioning of the human being. Accordingly, we argue that natural interaction between humans and robots needs to enable affective touch, especially in regard to the social robots that are designed to engage in social-human interaction.

2.5. The Role and Relevance of Touch in Social HRI

Several robots designed with tactile sensors can be found in the literature. Most notable are perhaps the small, animal-shaped robotic companions with full-body sensing, designed to detect the affective content of social interaction, for example, the robot seal Paro [53], the Huggable [54], and the Haptic Creature [55]. These robots are small in size, which allows them to be picked up and held, and are designed for HRI applications in companionship and therapeutic interventions. The Huggable has a “sensitive skin” that features four modalities of somatic information: pain, temperature, touch, and kinesthetic information [54]. The Haptic Creature has touch sensors and an accelerometer that allow the robot to sense when it is being touched and moved [55]. Other work has studied how people try to touch robots [56,57]. Noda et al. [57] conducted a field experiment with the Robovie robot, asking participants to play with it. Their findings revealed three spontaneous haptic interaction behaviors (patting on the head, hugging, and touching the robot body) and showed that soft skin on a robot tends to invite humans to engage in even more touch interactions; it is therefore of major importance that the robot can process haptic feedback. Knight et al. [56] aimed to populate a social touch taxonomy by observing social gestures demonstrated by humans and then implementing pattern recognition algorithms, studying how socially laden gestures (like a hug) as well as local gestures (like a poke) could be detected from contact between a human and a teddy-bear body. They argued that there is a locational significance to touching an anthropomorphically shaped robot body that can be used to infer the symbolic value of touch, instead of detecting the sensor-level touch that much prior work has focused on. Yohanan and MacLean [55] examined how humans communicate emotional states through touch to their touch-centric social robot and what responses they expect from it. A user study was conducted where participants selected and performed touch gestures to convey nine different emotions to the robot. The major findings are patterns of gesture use for emotional expression; physical properties of the likely gestures; expectations that the robot's response will mirror the emotion communicated; and an analysis of the user's higher intent in communication. The findings also reveal five tentative themes of “intent” that overlap emotion states: protective, comforting, restful, affectionate, and playful. The obtained results may support the future design of social robots by clarifying details of affective touch interactions between humans and robots [22,55].
When it comes to humanoid robots, the capability to recognize affective touch may be even more important due to the humanoid form which might elicit expectations of a high degree of social intelligence. Cooney et al. [58] studied users’ affectionate behaviors toward a humanoid robot with capabilities of touch, vision, and sound. In this multimodal interaction, the users were free to interact with the robot in any way they liked and were asked to describe how much affection they communicated by their behavior. The results show that touch was considered significantly more important for conveying affection than distancing, body posture, and arm gestures. Thus, touch plays an important role in the communication of affection from a human being to a humanoid robot [22,58].
In another study, Cooney et al. [59] investigated how users touched a humanoid robot when conveying affection, i.e., positive feelings of love, gentleness, regard, and devotion. This was then used to build a recognition system that can recognize people’s affectionate behaviors by combining touch and vision-based approaches. In order to identify typical touches, participants were instructed to convey various intentions and emotions and, for each touch, describe the degree of affection conveyed by the touch. A total of 20 typical touch gestures were identified, of which hugging, stroking, and pressing were the most affectionate, patting, checking, and controlling were neutral touch gestures, whereas hitting and distancing were unaffectionate. Thus, affective touch, as fundamental in human communication and crucial for human bonding, is likely to take place also in the interaction between humans and social robots. It should therefore be considered important for the realization of a meaningful and intuitive interaction between human beings and robots. This is in line with the findings presented by Lee et al. [17], which showed that physically embodied robots were evaluated as having a greater social presence than disembodied social robots (a screen character version of the robot). However, the physical embodiment alone did not cause the positive experience. In fact, when users were prohibited from touching the physically embodied robot, they evaluated the interaction and the robot’s social presence more negatively than when they were allowed to interact with the robot via touch. This suggests that tactile communication is essential for successful social interaction between humans and robots [17] and that the fundamental role of tactile interaction in interpersonal relationships goes beyond human–human interaction and extends to human–robot interaction [22].

3. Method

In this study, we have investigated how users convey emotions via touch to the humanoid robot Nao [20]. In this paper, we report a subset of this work, focusing on the users' experiences of interacting with the robot and the aspects that affect the experience and the touch behavior. The study takes inspiration from the human–human interaction research performed by Hertenstein et al. [16], where participants were asked to convey, via touch, eight different emotions to another person.

3.1. Participants

Sixty-four volunteers between 20 and 30 years of age participated in the experiment (32 men and 32 women). All were recruited via fliers and mailing lists at the University of Skövde in Sweden and received a movie ticket for their participation. The majority of the participants were staff or students at the university. Forty-seven participants were Swedish speakers and the remaining 17 were English speakers. No participant reported previous experience of interacting with a Nao robot.

3.2. Procedure and Material

The study took place in the Usability Lab at the University of Skövde, which consists of a medium-sized testing room furnished as a small apartment and an adjacent control room. The testing room is equipped with three video-cameras and a one-way observation window. The adjacent control room allows researchers to unobtrusively observe participants during studies and is fitted with video recording and editing equipment. The participants entered the testing room to find the Nao robot standing on a high table (Figure 1).
Following Hertenstein et al. [16], eight different emotions were displayed in a random order on individual slips of paper. These emotions consisted of five basic emotions: anger, disgust, fear, happiness, and sadness; and three pro-social emotions: gratitude, sympathy, and love. A built-in functionality, “Autonomous Life”, was active on the Nao robot [20] during the experiment. This comprised simulated breathing, turning the head towards the participant's face, and changing eye color to give the impression of blinking. The robot was in other respects passive, remaining in a standing position during all interactions.
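For readers who wish to reproduce a comparable setup, the sketch below shows how such idle behavior can be enabled through the NAOqi Python SDK. It is a minimal illustration under our assumptions, not the script used in the study; the IP address is a placeholder, and the exact Autonomous Life state available may vary between NAOqi versions.

```python
# Minimal sketch (not the study's actual script): putting Nao into an
# "Autonomous Life" idle state via the NAOqi Python SDK. The "solitary"
# state yields idle behaviors such as simulated breathing, turning the
# head towards detected faces, and eye-LED activity, while the robot
# otherwise remains passive in a standing position.
from naoqi import ALProxy

ROBOT_IP = "192.168.1.10"  # placeholder; use your robot's address
PORT = 9559                # default NAOqi port

life = ALProxy("ALAutonomousLife", ROBOT_IP, PORT)
motion = ALProxy("ALMotion", ROBOT_IP, PORT)

motion.wakeUp()            # stand up with motor stiffness enabled
life.setState("solitary")  # enable autonomous idle behaviors
```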
The participants were instructed to read each emotion from the paper slips, think about how they wanted to convey the specific emotion, and then make contact with the robot's body, using any form of touch they found appropriate to convey the emotion. To preclude the possibility of providing non-tactile cues to the emotion being conveyed, the participants were advised not to talk or make any sounds. Participants were not time-limited, as this might have constrained the naturalness or creativity of the emotional interaction. Following the setting used by Hertenstein et al. [16], the study's purpose, background story, and context were not communicated to the participants in advance.
One of the experimenters was present in the room with the participant at all times, and another experimenter observed from the control room. All interactions were video-recorded. The recordings of the tactile displays were analyzed, and the four main touch components were coded for each individual subject's interactions following Hertenstein et al. [16]: touch intensity, touch duration, touch type, and touch location. The outcome of this has been reported in other papers [21,22,60].
In addition to the experimental design based on Hertenstein et al. [16], a further purpose was to investigate the participants' subjective experiences of interacting with the robot via touch and to explore what aspects may influence the experience as well as the touch behavior. To this end, a questionnaire and an interview with open-ended questions were created to obtain the kind of data needed for this purpose. Using standard questionnaires, such as Godspeed [61] and NARS [62], was not adequate, since they focus on measuring certain concepts; neither their measurements nor their concepts were in line with this purpose. This part was conducted at the end of the experimental run. Quantitative data was collected by the questionnaire, which was administered to the participants and comprised two questions relating to their experience of participating in the experiment. The participants' replies were reported on two different five-point rating scales. Each question was answered eight times, once for each emotion (Question 1–Question 2). Qualitative data was also collected from all participants with the use of six open-ended interview questions regarding the participants' UX and an exploration of influencing aspects (Question 3–Question 8). The interviews were conducted by the first two authors. The questions were phrased as follows (a sketch of how the resulting quantitative data can be organized is given after the list):
  • How easy or difficult was it to convey the emotions via touch? 1: Very easy, 2: Easy, 3: Neither easy nor difficult, 4: Difficult, 5: Very difficult.
  • How certain are you of the way you chose to convey the emotions? 1: Very certain, 2: Certain, 3: Neither certain nor uncertain, 4: Uncertain, 5: Very uncertain.
  • Do you think you would have conveyed the emotions in the same way if (a) it had been a human, (b) the robot had been larger, and (c) the robot had been soft?
  • Where would you place the robot on the following scale? Feminine—Neutral—Masculine
  • Do you think that you would have conveyed the emotions in the same way if you had perceived the robot as <feminine/masculine/of a specific gender>? [Adapt the question to the answer on Question 4].
  • What human age would you attribute to the robot?
  • Do you have a lot of experience interacting with children?
  • How would you summarize your experience of interacting with the robot via touch?
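To make the structure of the quantitative answers concrete, the following is a hypothetical sketch of how the ratings for Questions 1 and 2 can be organized for analysis. The participant identifiers and values shown are illustrative, not the study's actual data.

```python
# Hypothetical sketch of the quantitative data layout: each participant gives
# one 1-5 rating per emotion for Question 1 (difficulty) and Question 2
# (certainty). All identifiers and values below are illustrative.
EMOTIONS = ["anger", "disgust", "fear", "happiness",
            "sadness", "gratitude", "sympathy", "love"]

ratings = {
    # participant_id: {emotion: (q1_difficulty, q2_certainty)}
    1: {"anger": (4, 2), "love": (2, 1)},   # ... one pair per emotion
    2: {"anger": (5, 3), "love": (1, 1)},
}

def mean_rating(emotion, question_index):
    """Average rating (0 = Q1 difficulty, 1 = Q2 certainty) for one emotion."""
    values = [answers[emotion][question_index]
              for answers in ratings.values() if emotion in answers]
    return sum(values) / len(values) if values else None

print(mean_rating("anger", 0))  # mean difficulty for anger -> 4.5
```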

3.3. Data Analysis

After the experiment, the recordings were transcribed by the second author, and an inductive analysis was conducted by the first author on the qualitative data from Question 3 to Question 8. An introduction to inductive analysis is provided by Thomas [63], and an elaborate exposition is available in [64]. Analytical tools and coding procedures provided by grounded theory [65] were used, in particular open coding, axial coding, and questioning. In essence, grounded theory means that an inductive analysis is conducted with the high-level aim of creating a theory firmly grounded in empirical data. Patterns, themes, and categories gradually emerge from the data, instead of pre-defined categories or existing theories being forced upon it. In our case, due to the limitations of the experimental setting, it was not possible to use all available analytical tools and coding procedures, e.g., theoretical sampling. Hence, it was not possible to reach the aim of a coherent theory. The specific procedure was carried out as follows (a schematic sketch of the coding bookkeeping is given after the list).
  • First, all separate answers, i.e., each answer per participant and question, were indexed with participant number and gender. The separate answers were the basic components of the analysis.
  • Then one topic memo per question (TM-Q) was created in which all separate answers to that question were placed in high-level groups. The specific high-level groups of a question were chosen based on the general nature of the question. For example, for Question 3 the high-level groups were “Yes—would probably have done in the same way”, “No—would probably not have done in the same way”, and “Unsure—express themselves ambivalently”.
  • In the topic memo for Question 8, a rough quantification was made based on an assessment of expressions regarding positive, negative, and neutral experiences in relation to the robot and to conveying emotions by touch, respectively, in order to obtain a feeling for the weight of the experiences.
  • While creating and walking through the TM-Qs, tentative categories (TCs) emerged. Three sensitizing questions were used in order to identify TCs: “Which experiences?”, “What influence?”, and “How do the participants reason?”.
  • Thereafter, all TM-Qs were coded with the TCs, and a topic memo per TC (TM-TC) was created in which all separate answers that had been coded with a specific TC were placed.
  • Then the TCs were grouped based on their common characteristics and each group of TC was given a name. A textual description was written based on the patterns in the answers in each TM-TC and TM-Q.
  • After that, the TCs were reviewed and the final categories were decided; some TCs were merged, some were given new names, and others remained in their original form.
  • Each category is either an experience or an affecting aspect, which forms the structure of the obtained results.
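The bookkeeping behind these steps can be pictured schematically as follows. This is an illustrative sketch only: the group labels and the classification functions stand in for the researchers' manual judgment, and none of the names are from the actual analysis.

```python
# Schematic sketch (illustrative only) of the coding bookkeeping: separate
# answers are indexed by participant, grouped into one topic memo per
# question (TM-Q), and later re-grouped into one topic memo per tentative
# category (TM-TC).
from collections import defaultdict

# Each separate answer: (participant_id, gender, question_no, text)
answers = [
    (16, "F", 3, "No, with a human I would also have used my voice"),
    (11, "F", 3, "Yes, I imagined that it was a person there"),
]

def build_tm_q(answers, question_no, classify):
    """TM-Q: all answers to one question, placed in high-level groups."""
    memo = defaultdict(list)
    for pid, gender, qno, text in answers:
        if qno == question_no:
            memo[classify(text)].append((pid, gender, text))
    return memo

# High-level groups for Question 3 were "Yes", "No", and "Unsure"; this
# trivial classifier is a stand-in for the researcher's reading of each answer.
tm_q3 = build_tm_q(answers, 3,
                   lambda text: "No" if text.startswith("No") else "Yes")

def build_tm_tc(tm_qs, code):
    """TM-TC: re-group answers from all TM-Qs by tentative category (TC)."""
    memo = defaultdict(list)
    for tm_q in tm_qs:
        for group, items in tm_q.items():
            for item in items:
                for tc in code(group, item):   # manual coding step
                    memo[tc].append(item)
    return memo
```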
The inductive analysis used in qualitative research contrasts with the hypothetico-deductive approach of experimental design, where hypotheses are stated beforehand (e.g., [64]). In qualitative approaches, categories or dimensions of analysis emerge from open-ended questions and/or observations as the researcher comes to understand the patterns and aspects that exist in the phenomenon being studied.
Thomas [63] emphasizes that inductive analysis refers to approaches that primarily use detailed readings of raw data (in this case the interview transcripts) to derive concepts, themes, or a model through interpretations made from the raw data. The primary purpose of the inductive approach is to allow research findings to emerge from the frequent, dominant, or significant themes inherent in raw data, presenting aspects, themes or dimensions of patterns of the underlying structure of experiences that are evident in the text data [63].
Grounded theory [65] is one of the major schools of qualitative research; it provides guidance through the various steps and processes of the analysis. However, it is still up to the researcher to make sense of the interview transcripts beyond the questions raised. Questions 3–8 in our study addressed different perspectives of the phenomenon of interest (user experience and the aspects that affect the touch behavior as well as the user experience), which the inductive analysis then synthesized creatively into the revealed results.
The outcome of the qualitative analysis is then presented through the themes or aspects most relevant to the identified aim of the study [63]. This synthesizing process is one of the major challenges of qualitative analysis. The ability to conceptualize and synthesize often appears as a tendency to discover and ‘see’ what is concealed behind what is actually said and done [64]. Typically, the presentation of the findings consists of descriptions of the most important themes, aspects, or patterns identified in the data analysis [63]. Here, the outcome of the inductive analysis is presented in Section 4, organized by the most relevant themes.

4. Results

In this section, the results from the inductive analysis (cf. Section 3.3) of primarily the interview answers, but also the questionnaire, are presented. The patterns that emerged from the qualitative data are described together with quotations (referred to as Q) from the participants as illustrations. The quotations are transcripts of spoken English and translations of spoken Swedish exactly as phrased by the participants; hence, compared to carefully written English, they may appear poorly expressed.
  • First, the user experiences of interacting with the robot and conveying emotions by touch are described in Section 4.1.
  • Then, aspects affecting user experience and the touch behavior are presented in Section 4.2.

4.1. User Experiences

The dominant experience is negative, primarily in relation to conveying emotions by touch, but also in relation to the robot. There is a mix of experiences among the participants, but there are also mixed feelings within individuals.

4.1.1. Negative Experience

The negative experiences, in relation to the robot as well as to conveying emotions by touch, are unsafe, odd, and difficult. Moreover, the experience tends to influence the touch behavior of the users; see the examples below.
Unsafe. Several participants mention that they felt unsafe when interacting with the robot because they were afraid of breaking it [Q1]. The robot gave the impression of being fragile due to its small size, its hard surface, and the fact that it is de facto a robot, i.e., an artifact that can break (Figure 2). This in turn affects the touch patterns, e.g., lower intensity and avoidance of certain touch types such as pushing [Q2].
Quotation 1: “… because I am nervous that I will break it. Therefore, I get stuck in that all the time that it is a robot. That I cannot … It is a dead gadget, I cannot push it on the side, sort of … I’m afraid that I will break it.”
(Participant 64, male, translated from Swedish)
Quotation 2: “… now I was very unsafe when I was going to push it, since you think that it is a robot, it is a thing … like it is fragile.”
(Participant 15, male, translated from Swedish)
Odd. The odd experiences range from awkward to unusual. The odd feeling arose primarily from the task at hand [Q3], i.e., expressing emotions “on command”, conveying them only by touch when it is more natural to include more modalities or not to touch at all, and doing so to a robot rather than a human. For some emotions, it is not natural to use touch at all to convey them.
Quotation 3: “By touch? God, I feel awkward. Awkward and nervous is it like, that’s the way I would summarize it. It doesn’t feel natural.”
(Participant 16, female, translated from Swedish)
The oddness is also derived from the robot as such. Several participants were unfamiliar with robots, which increased the odd feeling. The same goes for some of the robot's characteristics, such as its small size, lack of response, and hard surface [Q4].
Quotation 4: “It felt odd. Especially since it was so small, then you perhaps can think of it as a child, but it … I still tried to think that it was an adult so that it wouldn’t feel too strange. And that it was so hard.”
(Participant 61, female, translated from Swedish)
No participant claimed that the interaction and situation felt natural; instead, they gave statements like “it would have been more natural if …”. A more natural experience would have been gained if the robot had been softer [Q5], like a living creature or a teddy bear. The hard surface and stiff limbs of the robot make it more unnatural to touch. Moreover, for some types of touch, e.g., hugging, it would have been a more natural experience if the robot had been larger (Figure 3).
Quotation 5: “… if it would have been soft, then … it would feel more real and more human like and then it would be easier to imagine that you really convey emotions because if you touch a plastic surface, it’s hard to try to transmit emotions I think.”
(Participant 2, male)
Difficult. The participants emphasize that it is difficult to convey emotions to a robot only by touching it, and some emotions are more difficult than others to convey [Q6]. In particular, the fact that participants were restricted to touch, and not allowed to use other modalities [Q7], as well as the lack of responses from the robot, made the task more difficult (Figure 4).
Quotation 6: “Sometimes very difficult, especially for the emotions like sadness or disgust because … […] because I convey it more with speaking to the person and via touch … I don’t know if it is easy to convey it via touch. […] gratitude, shaking hands, feeling like “thank you, thank you, thank you very much” is easier to convey than sadness.”
(Participant 33, female)
Quotation 7: “The most difficult thing was the non-talking part. Yes, that was really hard.”
(Participant 30, female)
There are several aspects contributing to making the task more difficult. The fact that the receiver was a robot and not a human made it more difficult [Q8]. The participants thought the task would have been difficult even with a human receiver, but it became even harder with a robot. Facing a robot was a new situation for most of the participants, which added several difficulties. It was also unclear what kind of relation the participants were supposed to have to the robot, and since conveying emotions is relation-dependent, that issue made the task more difficult.
Quotation 8: “Difficult. Difficult … yes. I don’t find it easy to convey emotions normally so it became even harder now, with a robot. So it was overall difficult.”
(Participant 18, male, translated from Swedish)
The small size of the robot made it more difficult, mainly because of its perceived fragility, which excluded some touch types for some participants and, thus, made it harder to decide how to touch [Q9].
Quotation 9: “It is very difficult then. That is a bit small so that you cannot express yourself in a normal way, actually.”
(Participant 39, female, translated from Swedish)
No participant claimed that the task or interaction was easy; instead, many participants made statements like “it would have been easier if …”. For example, they believed it would have been easier if the robot had been larger. It would also have been easier to be more precise and clearer if there had been more body to touch. A larger robot, with proportions the participants were more used to, would have made it easier to choose how to convey emotions. Moreover, a larger robot would not have given such a fragile impression, which also would have made things easier. A soft and flexible robot would also have been easier: it would have felt more natural, which in itself would make the situation easier, and it would also have been easier to physically feel that you were doing it correctly [Q10]. If the robot had provided a more interactive response to the touches, the participants think it would have been easier to convey emotions.
Quotation 10: “… though I think it would have been easier [if soft] to feel that it was correct perhaps. Because it would have felt more natural since you are used to interact with soft creatures […]. So it would have felt more like it was … that it was correct the way you did it.”
(Participant 38, male, translated from Swedish)

4.1.2. Positive Experience

The positive experiences in relation to the robot vary from a weaker “nice” to a stronger “interesting and fun”. Across all the answers, “interesting and fun” occurred more frequently among the participants than “nice”.
Interesting and fun. The interesting and fun experiences in relation to the robot are influenced by the response from the robot, i.e., its sound and motions, and by the fact that encountering a robot was new to most of the participants [Q11]. Some participants also highlighted that they were fond of robots, which made them more predisposed to have a positive experience (Figure 5).
Quotation 11: “And then you are a little bit surprised that suddenly the robot turns to you, but it’s a new experience and it is cool. I really liked it.”
(Participant 41, female)
The interesting and fun experiences in relation to conveying emotions by touch originate from the fact that they find it to be a pleasant exercise, like a puzzle for them to work out.

4.1.3. UX Dimensions

The experiences can be summed up into four dimensions that are particularly important for the participants when interacting with robots. In their positive form, these are: safe, natural, easy, and interesting and fun. Their negative forms are: unsafe, odd, difficult, and uninteresting and boring (Figure 6).

4.2. Aspects Affecting the User Experience and Touch Behavior

Achieving a positive user experience is a matter of intertwined aspects, stemming from the human, the robot, the interaction, the task, and the context. Based on the patterns in the interviews, several aspects that influence the UX as well as the touch behavior have been identified. Four dimensions of touch behavior have been taken into account in this study, in line with Hertenstein et al. [16]. These are:
  • Touch intensity
  • Touch duration
  • Touch type
  • Touch location
The aspects influencing the actual experiences can be grouped into: (a) individual characteristics, (b) ways of thinking, (c) robot characteristics, and (d) task and interaction characteristics (Table 1). It should be noted that the aspects in these groups are not independent; rather, they relate to and influence each other.
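For concreteness, each coded touch display can be pictured as one record holding the four touch-behavior dimensions. The sketch below is hypothetical; the field names and values are illustrative and do not reproduce the study's actual coding scheme.

```python
# Hypothetical sketch of one coded touch display, with one field per
# touch-behavior dimension. Values are illustrative, not the study's
# actual coding scheme.
from dataclasses import dataclass

@dataclass
class TouchDisplay:
    participant_id: int
    emotion: str       # one of the eight emotions, e.g., "gratitude"
    intensity: str     # touch intensity, e.g., "light", "moderate", "strong"
    duration_s: float  # touch duration in seconds
    touch_type: str    # e.g., "stroke", "pat", "squeeze", "push"
    location: str      # region of the robot's body, e.g., "arm", "head"

example = TouchDisplay(participant_id=7, emotion="love", intensity="light",
                       duration_s=2.5, touch_type="stroke", location="arm")
```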

4.2.1. Individual Characteristics

The aspects included in the group of individual characteristics are derived from how participants view themselves. The aspects are (a) to show emotions, (b) to touch others, and (c) predisposition towards robots.
To show emotions. A few participants describe themselves as people who barely show emotions at all and who, in general, find it difficult to express how they feel. They find it even more difficult when interacting with a robot compared to a human [Q12]. Although these participants are not in the majority, they point to something important, namely the existence of individuals who have difficulties conveying emotion, an issue that also needs to be taken into account when designing emotionally interactive robots for a wider user setting (Figure 7).
Quotation 12: “… better for me that hardly show emotions at all. But I think I might have had an easier connection if it had been a human. Of course, I cannot know for sure, but I think it would have been the case.”
(Participant 4, male, translated from Swedish)
To touch others. Similarly to the aspect of showing emotions, some participants regard themselves as not being “physical people”, i.e., they seldom or never touch other humans [Q13]. This personal attribute probably makes it more difficult and odder for them to interact with robots by touch compared to people who touch others more often in everyday life. Those non-touching individuals are in the minority but still need to be considered when designing for physical human–robot interaction.
Quotation 13: “Most often I don’t touch other humans at all. It is very, very unusual.”
(Participant 23, male, translated from Swedish)
Another aspect is that some participants expressed that, in general, they do not use touch to show emotions to other humans, preferring body language, facial expressions, or verbal language [Q14]. Others pointed out that whether or not they would use touch in everyday life depended on which emotion they were conveying. Anger, disgust, and fear in particular were considered non-touching emotions, while love is an emotion usually conveyed by touch in the real world. Overall, conveying emotions only by touch was experienced as odd and difficult (Figure 8).
Quotation 14: “… because I think I would use my face to show the human what I feel. I am not a person that touch much, that’s why I think I would most show the things with my face, not with my hands.”
(Participant 48, female)
Predisposition towards robots. Some participants highlighted that they were interested in and fond of robots [Q15]. This could have an effect on their expectations and experience of their encounters with robots (Figure 9). Regarding this aspect, the data did not point to any specific UX dimension, only to UX in general.
Quotation 15: “I would say that it was nice. I liked it. When I was a kid, I also had a robot. So it kind of reminds me, because it was about the same size. I don’t know. I liked it. I would say that it was a good experience. I like robots.”
(Participant 3, female)

4.2.2. Ways of Thinking

The aspects that are grouped in ways of thinking are concerned with how participants consciously or unconsciously think about or perceive the robot and the interaction, i.e., what kind of mental images are influencing the interaction. The aspects are (a) point of reference: human being, (b) relationship, (c) point of reference: softness, and (d) robot gender.
Point of reference: human being. A bare majority of the participants (54%) thought that they would not have conveyed the emotions in the same way if the receiver had been a human being, whereas 27% believed that they would have done the same. The remaining 19% were not sure if it would have made any difference.
The participants who believed that they would not have conveyed emotions in the same way to a human mentioned several reasons for this. Some explanations were the fact that only touch was allowed and no other modalities, and the lack of response from the robot [Q16]. Another motivation was that with a human being, the relationship at hand is important for how you convey emotions and touch the other person. Moreover, other reasons were the small size of the robot, the very fact that it is a robot, and its hard surface. It should also be noted that these participants said that whether or not they would have shown the emotion in the same way depended on the specific emotion.
Quotation 16: “It is a bit easier when you have a human … and read facial expressions and such stuff also. It is quite hard when it just stand there and sway a bit. So I don’t think I would have done in the same way if it had been a human.”
(Participant 19, male, translated from Swedish)
Among those participants who say they would have done it in the same way, humans were their point of reference regarding how to convey emotions via touch [Q17]. This means that they imagined what they would do if the receiver had been a human being.
Quotation 17: “I imagined that it was a person there, so … sure it had been a difference in size but I think would have acted the same.”
(Participant 11, female, translated from Swedish)
Some participants also tried to imagine how they would convey emotions based on the size of an adult. An adult human age is attributed to the robot by 30% of the participants: 27% set the age as young adult (20–29 years) and 3% as adult (more than 30 years). The robot is attributed as a teenager (13–19 years) by 22% of the participants, as a toddler (5 years or less) by 19%, and as a young school child (6–12 years) by 27%. However, it is not clear whether or not this had an effect on the mental image they used for their interaction pattern. It may be the case that they conveyed the emotions without taking age into account, and thereafter attributed a certain age to the robot when explicitly asked to.
Some of the participants talked about imagining a child when deciding how to convey emotions to the robot; this means, for instance, that they tried to be over-explicit so that a child would understand. Moreover, some types of touch are not used when the receiver is a child, especially when the robot's size makes the participant imagine a toddler. For example, it is considered unacceptable behavior to push or hit a child (and illegal in Sweden), which would make the participants avoid such types of touch [Q18]. Some participants found it confusing that the size of the robot was smaller than the mental child image they used for the interaction. This caused a mismatch between the child reference in their minds and the actual situation (Figure 10).
Quotation 18: “The same with anger … it had also been easier because when you are angry with someone … I think it would have been easier to … Dare to grab hard and also too … now it felt like you walked to and pushed a small child and it felt like ‘no, you cannot hit a child’.”
(Participant 42, female, translated from Swedish)
The human-like attributes of the robot made it natural for some participants to imagine a human receiver, but for others it became confusing [Q19]. In those cases, they mention that the robot had human-like features, but not to a sufficient extent. This caused a mismatch between the human point of reference in their minds and the actual robot's appearance. Primarily, it is the size that is said to cause the confusion, but the lack of softness is also mentioned.
Quotation 19: “I tried to imagine the robot as a human being but … it isn’t the same natural parts like on a human, but still I don’t think I would have done the same way. Not for all emotions anyway.”
(Participant 60, male, translated from Swedish)
Relationship. As some participants pointed out, touching is affected by the relationship between the people involved. Thus, the imagined relationship to the receiver impacted the way a person conveyed emotions to the robot by touch [Q20]. The amount of touch increased with a higher level of intimacy; conversely, the more distant the relationship, the less a person was inclined to touch. Friendship is an example of a relationship that lies along such an intimacy scale (Figure 11).
Quotation 20: “Yes and no, it depends on who it is. You act differently with different people depending on how you know them. If you don’t know them at all, you don’t touch them.”
(Participant 13, male, translated from Swedish)
However, participants also mentioned other relevant kinds of relationships, such as the adult–child relation, as well as cultural and linguistic closeness [Q21].
Quotation 21: “No … it depends a bit I think. Would it be someone that is same, like very close to me culturally, thus with language and so, then I don’t think so. If it had been a larger distance then I think it would have been more … but not just as much.”
(Participant 26, male, translated from Swedish)
Point of reference: softness. Some participants tried to imagine that the robot was soft instead of hard and, thus, had a soft robot as a point of reference for their touch patterns. This indicates that the degree of softness can have an impact on how humans interact with a robot by touch [Q22] (Figure 12); however, it is not clear from the data in what way (compare the aspect hard surface in Section 4.2.3).
Quotation 22: “No, since I imagined in my head that it was a soft object. That it was soft.”
(Participant 9, male, translated from Swedish)
Robot gender. A narrow majority of the participants (52%, of whom 25% were men and 27% women) said that the gender of the robot did not have an impact on how they conveyed emotions by touch. They believed they would have acted the same way no matter the gender of the robot. However, many (33%, of whom 17% were men and 16% women) still thought that they acted in a robot-gender-dependent way. The remaining 16% were not sure whether or not the gender of the robot made any difference.
When asked to place the robot’s gender on a continuum masculine–neutral–feminine, 36% set it as masculine, 25% between masculine and neutral, 38% as neutral, 2% between neutral and feminine, and none as feminine. An additional observation is that, among the participants who claimed the robot to be gender neutral, 25% used “he” instead of “it” when talking about the robot. The reason behind this is unknown, and it is open to speculation whether this was just a slip based on habit, whether the participants consciously or unconsciously experienced an expectation to view the robot as gender neutral but revealed themselves when speaking, or whether there was some other cause. Nonetheless, this may be important, as robot gender (experienced or imagined) could change the way a human interacts with a robot via touch.
Participants who claimed that an attributed gender of the robot did not influence their interaction supported this by stressing that it was de facto a robot and that gender thus did not matter [Q23].
Quotation 23: “No I don’t think so. Because I was just thinking about now, what gender this robot has, and not before. So now when I look at him, I think he is masculine, but before, it did not matter. Because it is just a robot. It is not a real person.”
(Participant 47, female)
Those participants who believed that robot gender influenced their way of conveying emotions by touch brought up several ways in which the interaction would change. They believed that gender-specific zones on the robot, e.g., female breasts or hips, would change where on the robot body they would touch [Q24], but also that the intensity of touch would change, primarily by touching a male-looking robot harder than a female-looking one. Likewise, the type of touch would change, e.g., avoiding pushing or hitting a female-looking robot [Q25]. It was furthermore highlighted that some emotions are more gender-dependent than others, for instance, love [Q26] (Figure 13).
Quotation 24: “Perhaps if it had been more feminine, that it had been built like a … built feminine with breasts and everything. Then you would perhaps have avoided to poke there […].”
(Participant 7, male, translated from Swedish)
Quotation 25: “In case of to express my anger I don’t think I would slap him if he was a girl.”
(Participant 56, female)
Quotation 26: “Actually yes, I would have … I guess I wouldn’t have … well maybe the love would be a little bit different because the love I imagined was between a girl and a boy, and love between girls is more … I guess more like hugging, feeling friendly to each other, maybe also give a kiss on the cheek or something like this. But the other emotions, I would convey in the same way because it is not really a difference between men and women.”
(Participant 33, female)
It should be noted that no participant spontaneously, i.e., before being explicitly asked, mentioned that the gender of the robot had influenced how they chose to convey emotions by touch.

4.2.3. Robot Characteristics

The aspects categorized as robot characteristics are derived from how the participants experienced the robot. The aspects are (a) de facto a robot, (b) size, (c) surface, and (d) fragility.
De facto a robot. The fact that the receiver of the interaction was a robot affected several participants (Figure 14). They found it difficult to disregard that it was not a human being, and the differences between humans and robots became apparent: the robot is a gadget that can break, whereas humans stand more stably and can understand emotions better [Q27].
Quotation 27: “No I don’t think so [i.e., would not convey emotions in the same way to a human]. But why I don’t know or like … A human may understand feelings easier than a robot. Or like, it feels weird to interact emotionally with a robot.”
(Participant 43, male, translated from Swedish)
It being a robot made participants experience the interaction as odd and different with regard to conveying emotions. In addition, they were not used to touching a robot; the action was new and unfamiliar to them. Furthermore, some believed it would have been easier if the robot had had more human-like features [Q28].
Quotation 28: “The more alike a physical human it was … then it would in some way be easier to maybe apply a human I on it, sort of. In contrast to now when it more looks like a really nice robot toy that you had when you were a child. Like that. But it had for example been … it had been even worse if it would not have had eyes. So even if it has like some human feature, that gives something, but the closer to a real human the easier I would have had to see … that this is something I can interact with emotionally.”
(Participant 16, female, translated from Swedish)
As noted above, participants who claimed that an attributed gender of the robot did not influence their interaction explained that this was due to it being de facto a robot and, thus, gender did not matter.
Size. The majority of the participants (56%) believed that they would have conveyed emotions in another way if the robot had been larger, while 27% thought they would have done so in the same way. The remaining 17% were unsure whether or not a larger size would have made any difference.
The small size of the robot had an effect on the user experience as well as on the touch. Across all participants (those who believed that size influenced their behavior, those who did not, and the uncertain ones), many claimed that if the robot had been larger, they would have found the interaction easier and more natural. They also thought that they would have felt more certain of how to convey the emotions and that the touch would have been more distinct if the robot had been larger. One reason for believing that the interaction would have been experienced as easier and more natural is that several participants had humans as a point of reference for their way of conveying emotions by touch, and the small size of the robot caused a mismatch with this way of thinking [Q29].
Quotation 29: “In particular […], but if it had been larger, like human-size, it had been much easier on the uptake and know where you should … how you should behave.”
(Participant 19, male, translated from Swedish)
The smallness of the robot made it appear fragile, which reduced the participants’ intensity of touch and made them avoid certain touch types, such as slapping the robot on its back. Moreover, human hands are big in comparison with the robot, which makes it more difficult to get hold of the intended places on the robot and requires more precise touches [Q30]. For some participants, the small size made them think of the robot as a small child, which also restrained them from using touch in the same way as they would with an adult (Figure 15).
Quotation 30: “No I would probably not have done it in the same way. Sort of the thing with the head and shoulders and such, that it was very tight and you sort of could not get hold of with the touch, I felt. I felt like it was a bit difficult. And then like … it is difficult to touch something that is small in the same way as something big, since … the hands also becomes too big, I think. It becomes more like that you … have to pick at it sort of. Something like that.”
(Participant 51, female, translated from Swedish)
Surface. The hard surface of the robot seems to influence, to some extent, the way emotions are conveyed: 44% of the participants believed that they would have conveyed emotions in another way if the robot had been soft. However, almost as many, 42%, thought that a soft robot would not have made any difference, whereas 14% were unsure whether the robot’s softness would have changed the way they conveyed emotions.
Several of those who thought they would have conveyed emotions in the same way explained that their point of reference was soft, i.e., they tried to imagine that the robot was soft. Although they would have used the same touch patterns, they mentioned that the experience would have been different: a more pleasant, easier, and more natural interaction [Q31]. This aligns with how the participants who thought softness would have changed their touch patterns believed they might have experienced it, i.e., that the hard surface made the interaction more difficult and unnatural. In addition, they expressed that a soft robot would have felt more solid, which in turn would have made them feel safer.
Quotation 31: “Yes [i.e., would have conveyed emotions in the same way], although I think it would have been easier to feel that it was correct maybe. Because it had felt more natural because you are more used to interact with soft creatures, so to speak. So it would perhaps have felt more that it was … it was correct the way you did it, so to speak.”
(Participant 38, male, translated from Swedish)
They also mentioned that soft parts invite touching, affecting the touch behavior: the duration of touch was believed to be prolonged, the intensity of touch was thought to increase, and the locations and types of touch were supposed to be different. In addition, it is noteworthy that the participants believed the effect on touch behavior to depend on the type of emotion; for instance, the duration of touch was thought to be affected for positive emotions but not negative ones [Q32] (Figure 16).
Quotation 32: “Then I think I would have touch even more. Maybe I would have made similar gesture but it would probably be more time touching. I don’t know how to say this … Like more likable to touch so maybe I spend more time … like if I want to express sympathy I will be more like ‘ahhh, it’s so soft’.”
(Participant 36, female)
Fragility. The fact that the robot was small and hard made it appear fragile, which in turn had an effect on the intensity of touch [Q33]. The participants reduced their intensity due to this sense of fragility; they were, for instance, afraid that the robot would fall and break [Q34]. As a consequence, the participants experienced a lack of safety during the interaction (Figure 17).
Quotation 33: “If it had been larger then you of course think that it is more solid, for some reason. I think that way, anyhow. Then I would probably have been more heavy-handed on those parts … Perhaps dared to grab it harder if you hugged it, and such.”
(Participant 5, male, translated from Swedish)
Quotation 34: “No probably it would be easier because I had the sensation that it is too fragile so it’s like I don’t want to touch too much or interact much because maybe it will fall down or break something.”
(Participant 37, male)

4.2.4. Task and Interaction Characteristics

The aspects categorized as task and interaction characteristics are those that relate to the participants conducting the task at hand. In general, task and interaction can be separated from each other, but in this experimental setting they are intertwined and are therefore analyzed and presented together. The aspects are (a) type of emotion, (b) response, (c) modality, and (d) encounters.
Type of emotion. The participants experienced the emotions differently. In general, they found the positive emotions easier to convey than the negative ones, and they were more certain that they would convey the positive emotions in the same way again than they were for the negative emotions (Figure 18).
The participants often mentioned that things depended on the emotion; for example, they would have conveyed one specific emotion in the same way if the receiver had been a human being instead of a robot, but would have chosen to convey another specific emotion differently. However, there are no clear patterns regarding the separate emotions apart from one aspect, namely the gender of the robot. The emotions love and anger tend to be gender-dependent, i.e., the experienced gender of the robot influences the ways these emotions are conveyed [Q35]. The emotions sympathy, gratitude, sadness, fear, and disgust are more gender-independent, i.e., the experienced gender of the robot does not affect the touch patterns for conveying them [Q36]. Happiness, on the other hand, shows an ambivalent pattern, and it is unclear whether it is dependent on or independent of the perceived gender (Figure 19).
Quotation 35: “No, I wouldn’t have punched it if I saw it as feminine. I would have slapped it [laughter]. But others, yeah. So just the anger-part.”
(Participant 62, female)
Quotation 36: “Maybe something like anger in a different way but the rest … or something like love … but yeah I think more or less the same, yes.”
(Participant 41, female)
Response. The response from the robot was of major importance to the participants, regarding the touch pattern as well as the experience. The only available response in this experiment was that the robot’s face followed the participant’s face in the autonomous life mode and that it made some beeps. It did not actively respond to the participants’ touch. In addition, the robot’s limbs were stiff and non-flexible, and it moved its body slightly to simulate breathing.
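To make this set-up concrete for readers familiar with the platform, the sketch below shows roughly how such a passive but lifelike state can be configured with the NAOqi Python SDK. It is an assumed reconstruction based on the description above, not the actual scripts used in the experiment, and the robot address is hypothetical.

```python
# Assumed reconstruction (not the experiment's actual scripts): a Nao
# that is lifelike but gives no programmed response to touch.
from naoqi import ALProxy

ROBOT_IP = "192.168.1.10"  # hypothetical robot address
PORT = 9559                # default NAOqi port

# The autonomous life "solitary" state provides background behaviors,
# including the slight body sway that simulates breathing.
life = ALProxy("ALAutonomousLife", ROBOT_IP, PORT)
life.setState("solitary")

# Basic awareness makes the head turn towards detected faces, i.e.,
# the face-following behavior the participants observed.
awareness = ALProxy("ALBasicAwareness", ROBOT_IP, PORT)
awareness.setEngagementMode("FullyEngaged")

# Full stiffness on all joints reproduces the stiff, non-flexible
# limbs reported by the participants.
motion = ALProxy("ALMotion", ROBOT_IP, PORT)
motion.setStiffnesses("Body", 1.0)
```

In such a state the robot exhibits head tracking, occasional sounds, and rigid limbs, but no reaction to touch, which matches the response repertoire described above.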
Conveying emotions is viewed as an interactive activity, which affected how the participants conveyed emotions to the robot compared to other humans. The initial touch did not differ; instead, it was the subsequent touch pattern that depended on the response from the receiver. The participants commented that, in real life, they gradually adapt their behavior, for example, increasing, decreasing, or discontinuing the touch, based on their interpretation of the response [Q37]. It also seems that the need for response is more important for some emotions than others, although it is not clear which emotions are more response-dependent.
Quotation 37: “If it had been a human then I may have acted a bit differently because then I would have expected some sort of response to further build upon with gestures. Now it was more like … it felt like … now it was one action … that it was … I was limited to. So, it would perhaps be a bit different.”
(Participant 9, male, translated from Swedish)
The stiffness of the robot’s limbs affected the touch pattern, since the non-flexibility made it difficult to use certain touch types, such as shaking the robot’s hand in a natural way [Q38].
Quotation 38: “[…] But I felt that for example the arm were a little bit rigid for some, I think it was gratitude or joy, what I wanted to do was raise his arms and do [inaudible] but the arms were a little bit hard so … I would try the same with a human, but I think with a human it would be easier because a human would probably follow my movement.”
(Participant 40, male)
Those participants who had a fun experience often attributed this, in part, to the response from the robot, limited as it was [Q39]. For others, the actual response was not enough for a positive experience. Instead, the lack of interactive response to the touch, as well as the stiffness, made the experience negative, as it felt odd and unnatural (Figure 20).
Quotation 39: “It is definitely something different. It is fun because when I touched it, I had the feeling that, in some way it responded some times. Not always, but some times. I wasn’t really sure how strongly I should touch it but … yeah it was a nice experience. Just to try it.”
(Participant 2, male)
Modality. The setting of the experiment, i.e., conveying emotions via touch only, was experienced as problematic by the participants. Conveying emotions in this way becomes artificial and constructed since, as stressed by many participants, in a natural setting with a human receiver a range of modalities would have been used, such as facial expression, eye gaze, gestures, body language, words, and tone [Q40].
Quotation 40: “Confusing. I mean interacting with the robot is not the thing that is confusing, it’s the exercise to try to express emotion without talking and without getting any response. The facts of interacting with the robot is not the confusing part because it is really like humanize, like it has the hands, the face, you can really see the different parts but I think in general it is really confusing for me to express emotion and if I cannot even talk or use my gestures, then it’s … I felt really lost trying to communicate in that way.”
(Participant 36, female)
Participants found it difficult, odd, and unnatural to use touch only, which forced them to consciously reflect on how to convey the emotions. Thus, the touching did not come automatically, which in turn makes it questionable whether the demonstrated touch patterns are the ones the participants would use in a natural setting [Q41].
Quotation 41: “It was cool. But it was difficult. In general, it is a bit difficult because you have to think carefully, otherwise it happens more instinctively with a human or so, maybe talk a bit. But then this is also fun, because it is something new. Yes … that’s about it.”
(Participant 25, male, translated from Swedish)
Furthermore, emotions differ in how multimodal they are, i.e., some emotions are more tactile in their nature than others. Some emotions seldom involve touching at all, e.g., disgust [Q42], whereas shaking hands can be natural when conveying gratitude (Figure 21).
Quotation 42: “Some parts of it was a bit uncomfortable since they are emotions that you in general try to avoid. Disgust I found very difficult. For that I had to think for a while. But for disgust in particular it was … trying to convey it by touch, it is a contradiction in itself. I think.”
(Participant 7, male, translated from Swedish)
Encounters. For several participants, this was their first encounter with a humanoid robot. Hence, there is a first-time experience to take into account, since their experience may change after more long-term and frequent encounters with social robots (Figure 22).
Some participants were thrilled to meet and interact with a robot for the first time and found it interesting and fun [Q43]. Others experienced it more negatively, as they felt insecure [Q44].
Quotation 43: “It felt really responsive in some way. That it … I don’t know, but when I shook it, it gave some beep sounds and such. It felt exciting and such … that it was sort of smart. And the experience was also that it is exciting to meet such a robot, or touch it. You have seen clips and such and you know they are really advanced … and also very expensive. So this was cool.”
(Participant 1, male, translated from Swedish)
Quotation 44: “A little bit strange because I am not used to do something with a robot. I don’t know how can I touch him, so it has been strange, especially at the beginning.”
(Participant 34, female)

5. Discussion

The use of social robots in society is expected to increase gradually [2]. A prerequisite for this development is that the robots are positively experienced by their intended users; otherwise, the robots risk being dismissed. It is therefore imperative to deepen the understanding of the complexity of UX, its underpinnings, and the aspects that affect it. In addition, those aspects influence humans’ interactive touch patterns. The basis of this paper is an experiment on affective tactile HRI that investigates how humans convey emotions to robots by touch only, with the purpose of increasing the comprehension of how users experience this particular kind of interaction and of which aspects influence the experience and the touch behaviors. This paper provides a step towards such an understanding by taking the UX perspective of the humans in a specific kind of interactive setting.
Based on the results of this study, four dimensions of experience are particularly important for users when interacting with robots. In their positive forms, these are: safe, natural, easy, and interesting and fun. Their negative forms are: unsafe, odd, difficult, and uninteresting and boring. Often it is not possible, particularly with limited resources, to put equally high effort into every aspect that could potentially impact the user experience. Instead, it must be decided which particular experiences are most vital to evoke, and these must be carefully designed for and evaluated [66]. The four identified UX dimensions can thus be regarded as prioritized UX goals, although most likely not the only important ones.
Furthermore, an interesting observation in our study is that the dominant experiences were negative, in contrast to what could be expected from preceding research by Lee et al. [17], Ogawa et al. [18], and Turkle et al. [19], which showed that humans want to interact with robots via touch. However, this is not necessarily contradictory: an expressed preference for communicating by touch can be true alongside a current failure to set up an interactive touch mode that evokes an overall positive experience in the users. This underlines the importance of careful interaction design in relation to UX, since emotional touch in itself does not automatically induce positive experiences.
Regarding the UX, it became apparent that many users have a mixture of experiences, not only in terms of which UX dimensions are involved but also as a blend of positive and negative experiences during the interaction, even in cases where the interaction is non-complex and relatively short. It is thus reasonable to assume that complex and long-term use will involve more UX dimensions and more shifts between positive and negative. This emphasizes the importance of taking care when evaluating users’ experiences of social human–robot interaction. Merely asking users whether their experience is positive or negative is too blunt and will probably not provide an informative result; indeed, we see a tendency in HRI research to treat UX superficially [66]. Instead, the evaluation procedure needs to allow a broader scope of experiences. Examples of such broader scopes can be found in Young et al. [67], Weiss et al. [68], and de Graaf and Allouch [8].
An important question is what influences how the tactile interaction unfolds, as well as how the human experiences it. It becomes clear in this study that there is an interplay among many affecting aspects, which makes it difficult to study the aspects in isolation (Figure 23).
The aspects found in this study were divided into four groups: individual characteristics, ways of thinking, robot characteristics, and task and interaction characteristics. All groups influenced the UX and the touch behavior, but they also influenced each other. In addition, to make it even more complex, the UX in itself affected the touch behavior. For instance, the robot characteristics influence the humans’ ways of thinking, which in turn impact the touch behavior: the size of the robot affects the way humans imagine the ‘human age’ of the robot, and the attributed age impacts how the robot is touched, since a toddler is touched in different ways than an adult. Another example is that certain actual robot characteristics, e.g., being small and hard, influence another, perceived, robot characteristic, fragility, which causes the human to experience the interaction as unsafe. The unsafe experience, in turn, has an effect on the touch behavior in terms of reduced touch duration and intensity. Thus, accepting the complexity of influencing aspects is a necessity. The influence and importance of individual variance in how emotions are conveyed is in line with the findings of Silvera-Tawil et al. [12], whose results also show that individual interpretations and the context of interaction affect tactile communication of emotions.
In tactile interaction, females touch other people on more regions of the body than males do [21,69]. The interviews revealed another gender issue: there are indications that an attributed gender of the robot also triggers differences in emotional touch behavior, which calls for further investigation of gender effects on tactile interaction, regarding both robot gender in itself and the relation between human and robot gender. Prior research by Tay et al. [70] has shown that gender affects user acceptance in HRI, presenting findings that imply that gender interacts with occupational roles (security versus healthcare): participants preferred matching gender-occupational role and personality-occupational role stereotypes. However, personality-occupational role stereotypes showed a stronger effect on user acceptance than gender-occupational role stereotypes.
The experience of conveying emotions to the robot by touching it differs between emotions. The users in the present study found it easier, and were more certain, when communicating positive emotions (gratitude, happiness, love, and sympathy) compared to negative emotions (anger, disgust, fear, and sadness). A strong effect was found: seven times as many users found it difficult, or very difficult, to convey disgust compared to love. Similar findings were made by Yohanan and MacLean [55] and Cooney et al. [58], whose participants showed affection and received positive emotional responses from the robot, but did not express negative emotions such as hate and disgust, nor receive negative emotions in return. Taken together, this indicates that touch can be considered an interactive modality primarily associated with the communication of positive emotions, while other modalities may be the preferred choice when communicating negative emotions to a robot.
These results may seem to contrast with studies reporting observations of aggressive behavior against humanoid robots. For example, Brščić et al. [71] reported that children spontaneously show abusive behavior, including kicking and hitting, towards a social robot acting in a public environment. Similarly, Salvini et al. [72] observed young people kicking, punching, and slapping their robot acting in an urban environment. Nomura et al. [73] reported that children who show abusive behavior towards robots do not see the robot as just a machine, but as a human-like entity; as such, there is an association between children’s abuse of robots and abuse of humans or animals. These reports constitute clear examples of physical/tactile interaction carrying negative emotion of a kind that was not observed in the present study, possibly due to differences in age group, the robots used, or the experimental setting.
Two well-known aspects, response and multi-modality, were confirmed as particularly important for the UX in emotional interaction with social robots. Regarding response, many participants stressed the lack of response from the robot, as well as its rigidness and stiffness, as constraints on tactile communication. While tactile interaction is often seen as sender–receiver communication in the information transmission metaphor (see [74] for a discussion of different metaphors of interaction), these results highlight the interactive aspect of touch and the role of the receiver in providing a certain level of compliance and response, following the dance metaphor of interaction [74]. These aspects apply to the design of compliance control for rigid robots, but could also be relevant for soft robots or robots covered by soft materials, for example, textiles.
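As a concrete illustration of the compliance point, the following sketch lowers the joint stiffness of one Nao arm through the NAOqi motion API so that the limb yields to a user’s touch. This is a possible design direction rather than something implemented in the present study; the robot address and the stiffness value are illustrative assumptions.

```python
# Sketch of compliance on a rigid platform: letting a Nao arm yield to
# the user's touch by lowering its joint stiffness.
from naoqi import ALProxy

ROBOT_IP = "192.168.1.10"  # hypothetical robot address

motion = ALProxy("ALMotion", ROBOT_IP, 9559)

# Stiffness ranges from 0.0 (fully compliant, the limb can be moved
# freely) to 1.0 (fully rigid). A low value lets the arm follow the
# user's hand, e.g., during a handshake, while still carrying some of
# its own weight.
motion.setStiffnesses("RArm", 0.2)

# ... tactile interaction takes place here ...

# Restore full stiffness so the robot can again move the arm itself.
motion.setStiffnesses("RArm", 1.0)
```

A touch-triggered version of this idea, in which stiffness drops only while contact is detected, would correspond more closely to the responsive, dance-like interaction the participants asked for.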
Multi-modality, or in the setting of this study the lack of multi-modality, was emphasized by many participants as influencing the UX in emotional interaction with robots. Conveying emotions seems to be a multi-modal activity in itself, which makes it difficult and awkward for the user to use touch alone. Thus, despite the fact that Hertenstein et al. [16] reported high information content in affective tactile communication between humans, the participants in this study expressed a desire to use other modalities than touch, explicitly referring to, e.g., facial and vocal communication. We believe this should be understood as an argument for considering multi-modality in the interaction design of social robots.
The study also showed that participants thought about the robot in a variety of ways, which affected their touch behaviors. A number of participants had humans in mind when interacting; some did not. The participants attributed different ages, genders, and relations to the robot. Some thought of the robot as soft, others did not. These many different ways of thinking lead to a wide range of ways in which users choose to convey emotions, which in turn will likely make it difficult for robots to correctly decode the affective tactile interaction. By creating clear external conceptual models, it is possible to influence the thinking of the users [75]. Therefore, when designing human–robot interaction, we argue that it is important to guide users’ understanding of the robot, so that it becomes more coherent and the interaction more predictable. For example, we believe that different users are more likely to touch the robot in similar ways if the humanoid robot has a clear human age, e.g., a young child, compared to a robot designed with an ambiguous age. Of course, the touch behaviors will still differ between users, but the variety can be expected to decrease.
The participants frequently referred to the lack of context and of response from the robot. It may be worth noting that these observations can also be made in relation to previous research on human–human tactile communication. For example, Hertenstein et al. [16] used a similar setting, explicitly asking the participants receiving the tactile communication to remain passive. The present work indicates that this experimental setup, using a passive agent receiving tactile communication, may change the tactile interaction itself. Furthermore, the generic limitations of experiments also apply to this study, i.e., it is uncertain to what extent the obtained results hold in a natural setting. As commonly acknowledged, interaction between humans and digital artefacts is context-dependent and, hence, there is a need for further studies “in the wild”.
Several participants explicitly described the robot as responsive, which in fact it was not. We believe this may be related to the autonomous life mode being active on the robot, making it execute head movements towards the user. Although these responses were quite limited, they clearly engaged at least some users when executed at the right time. To what degree the perceived differences in interactivity were due to the functionality of the robot or to different user expectations is unclear.
Few users directly expressed that they adjusted their interaction as a result of the receiver being a robot rather than a human, but several users reported other aspects, inherent in the robot design, affecting the interaction. The perceived lack of robustness of the robot clearly limited the interaction, leading some users to reduce or even avoid tactile communication they otherwise would have expressed. The unsafe experience may have been amplified by the experimental setting, where the robot was standing on a table (see the picture in Section 3.2).
Another aspect of the robot design frequently mentioned by the users is its size. Nao is just above 57 cm tall, which some users perceived as a limiting factor for the interaction. Therefore, in ongoing work, we have extended this study to another robot platform, Pepper [20]. Pepper has a height of 120 cm and may be perceived quite differently with regard to tactile interaction.
Those who participated in this study volunteered after being made aware of it via flyers and mailing lists; hence, they were not randomly selected. Thus, there is a risk of bias, e.g., participants being particularly interested in robots, which may have influenced the results. No specific bias has been noted in the material; nonetheless, the risk remains.

6. Future Work and Concluding Remarks

Future work includes addressing the embodied and situated aspects of interacting with humanoid robots, which were implicitly mentioned in the users’ experiences of interacting with the robot through touch. The work reported in this study presents a short-term perspective on how the users experience the interaction. However, it lacks a long-term perspective, and it would be of interest to examine whether the UX would change if the HRI occurred over a prolonged time frame, e.g., whether the perceived trickiness of conveying emotions to the robot through touch only would remain. An additional aspect is the reverse direction: investigating how humans experience being touched by robots. Touch is by nature a bi-directional modality, and the envisioned smooth and intuitive interaction between humans and robots implies complex interaction patterns in order to humanize robots. For example, it has been reported that a robot’s touch reduces feelings of unfairness and enhances persuasiveness [76,77].
In addition, more elaborate and structured UX studies have to be conducted in this area of HRI research, where specific UX goals are identified and then repeatedly assessed [3,5,10]. Evaluating the quality of the human–robot interaction is of major concern for a user-centered perspective on social robots in HRI. Many evaluation methods and techniques have been developed, resulting in evaluations of different aspects including acceptance, usability, UX, learnability, safety, trust, and credibility. While some of these aspects are covered in depth, others are only briefly touched upon in HRI research [1,66]. Lindblom and Andreasson [6] stressed that there is a need for more theoretical as well as methodological knowledge about methods and techniques that are appropriate for evaluating UX in HRI, and that there is a significant need for valid usability and UX studies in order to exploit UX to its full effect and to improve the positive UX of socially interactive robots.
An additional issue is how cultural aspects affect human–robot interaction. The ways robots are imagined and designed in popular Western science fiction films promote and manifest a dystopian vision of the future, in which these robots draw on religious themes of salvation and apocalypse from the Christian culture that dominates in the United States, where these films are produced [78]. The imagined and perceived fear of robots envisioned in these films might potentially hinder the acceptance and use of robots in the Western world, which is dominated by Christian values [79]. Japan presents a different view of the role and perception of robots. Alesich and Rigby [78] pointed out that in Japan the acceptance of robots is higher than in the United States, and that Japanese society has been very eager to adopt humanoid robots for socio-political as well as religious reasons. Japanese acceptance of robots is aligned with religious ideas derived from Shinto and Buddhism, which both consider man-made objects a part of the natural world. This view is further strengthened in Japanese popular culture, in which robots are integrated and beloved members of society. That religious beliefs can have an effect on how people convey emotions by touch is also indicated by the results of Silvera-Tawil et al. [12]. As stressed by Alesich and Rigby [78], robot designers often bring social, moral, and aspirational biases to their designs; they are narrowly focused on technological achievements and on verifying their own designs, instead of considering the social implications of their view of robots as machines, tools, divine creatures, or beloved companions.
To conclude, we would like to point out that the UX perspective is of major concern, given that this type of interactive technology allows humans to become more socially situated in the world of technical artefacts. As humanoid robots become more ubiquitous in our daily lives, humans and robots are envisioned to interact and work in common and shared physical and socio-cultural spaces. These closer spaces change the nature of human–robot interaction considerably, and touch or haptic feedback could function as a useful additional modality for communication that is very natural for humans to utilize. This means that the real strength of humanoid robotics is its potential to facilitate more ‘natural’ human–machine interaction, allowing humans to interact with artefacts in the ways they are used to interacting with each other. Humans also expect that the experience of interacting with humanoid and other kinds of social robots should result in trust in these technologies. In order to achieve technical and social acceptance, interaction with robots should be effortless, envisioning smooth, natural, and intuitive interaction paradigms that, in the long run, foster public acceptance of these kinds of robots. Furthermore, studying how humans interact with humanoid robots may provide significant insights into the fundamentals of human–human interaction, because it is in relation to something more familiar that the unknown becomes visible and makes sense. Thus, it is a win–win situation for both perspectives, addressing what it means to be a human being.

Author Contributions

Conceptualization, B.A., R.A., and R.L.; Methodology, B.A., R.A., and R.L.; Formal analysis, B.A. and E.B.; Investigation, B.A., R.A., and R.L.; Data curation, B.A. and R.A.; Writing—original draft preparation, B.A. and R.A.; Writing—review and editing, E.B. and J.L.; Visualization, B.A.

Funding

This work has been carried out as part of the project “Design, Textiles and Sustainable Development” funded by the Region Västra Götaland (VGR) funding agency. The work has also been financially supported by the Knowledge Foundation, Stockholm, under SIDUS grant agreement no. 20140220 (AIR, “Action and intention recognition in human interaction with autonomous systems”).

Acknowledgments

The authors would like to thank Matthew Hertenstein for the important discussions regarding this work.

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Alenljung, B.; Lindblom, J. User experience of socially interactive robots: Its role and relevance. In Synthesizing Human Emotion in Intelligent Systems and Robotics; Vallverdú, J., Ed.; IGI Global: Hershey, PA, USA, 2015; pp. 352–364.
2. Boden, M.; Bryson, J.; Caldwell, D.; Dautenhahn, K.; Edwards, L.; Kember, S.; Newman, P.; Parry, V.; Pegman, G.; Rodden, T.; et al. Principles of robotics: Regulating robots in the real world. Connect. Sci. 2017, 29, 124–129.
3. Hartson, R.; Pyla, P.S. The UX Book: Process and Guidelines for Ensuring a Quality User Experience; Morgan Kaufmann: Amsterdam, The Netherlands, 2012; ISBN 978-0-12-385241-0.
4. Hassenzahl, M.; Roto, V. Being and doing: A perspective on user experience and its measurement. Interfaces 2007, 72, 10–12.
5. Anderson, J.; McRee, J.; Wilson, R.; the Effective UI Team. Effective UI; O’Reilly: Sebastopol, CA, USA, 2010; ISBN 978-0-596-15478-3.
6. Lindblom, J.; Andreasson, R. Current challenges for UX evaluation of human-robot interaction. In Advances in Ergonomics of Manufacturing: Managing the Enterprise of the Future; Schlick, C., Trzcieliński, S., Eds.; Springer: Cham, Switzerland, 2016; pp. 267–277.
7. Weiss, A.; Bernhaupt, R.; Tscheligi, M.; Yoshida, E. Addressing user experience and societal impact in a user study with a humanoid robot. In Proceedings of the Symposium on New Frontiers in Human-Robot Interaction (AISB2009), Edinburgh, UK, 6–9 April 2009; pp. 150–157; ISBN 190295680X.
8. De Graaf, M.M.A.; Allouch, S.B. Exploring influencing variables for the acceptance of social robots. Robot. Auton. Syst. 2013, 61, 1476–1486.
9. Hassenzahl, M. User experience and experience design. In The Encyclopedia of Human-Computer Interaction, 2nd ed.; Soegaard, M., Friis Dam, R., Eds.; The Interaction Design Foundation: Aarhus, Denmark, 2013.
10. Hassenzahl, M.; Tractinsky, N. User experience—A research agenda. Behav. Inf. Technol. 2006, 25, 91–97.
11. Dautenhahn, K. Socially intelligent robots: Dimensions of human-robot interaction. Philos. Trans. R. Soc. B Biol. Sci. 2007, 362, 679–704.
12. Silvera-Tawil, D.; Rye, D.; Velonaki, M. Interpretation of social touch on an artificial arm covered with an EIT-based sensitive skin. Int. J. Soc. Robot. 2014, 6, 489–505.
13. Montagu, A. Touching: The Human Significance of the Skin, 3rd ed.; Harper & Row: New York, NY, USA, 1986; ISBN 978-0060960285.
14. Dahiya, R.S.; Metta, G.; Valle, M.; Sandini, G. Tactile sensing—From humans to humanoids. IEEE Trans. Robot. 2010, 26, 1–20.
15. Silvera-Tawil, D.; Rye, D.; Velonaki, M. Artificial skin and tactile sensing for socially interactive robots: A review. Robot. Auton. Syst. 2015, 63, 230–243.
16. Hertenstein, M.J.; Holmes, R.; McCullough, M.; Keltner, D. The communication of emotion via touch. Emotion 2009, 9, 566–573.
17. Lee, K.; Jung, Y.; Kim, J.; Kim, S.R. Are physically embodied social agents better than disembodied social agents? The effects of physical embodiment, tactile interaction, and people’s loneliness in human-robot interaction. Int. J. Hum.-Comput. Stud. 2006, 64, 962–973.
18. Ogawa, K.; Nishio, S.; Koda, K.; Balistreri, G.; Watanabe, T.; Ishiguro, H. Exploring the natural reaction of young and aged person with Telenoid in a real world. J. Adv. Comput. Intell. Intell. Inform. 2011, 15, 592–597.
19. Turkle, S.; Breazeal, C.; Dasté, O.; Scassellati, B. First encounters with Kismet and Cog: Children respond to relational artifacts. In Digital Media: Transformations in Human Communication; Messaris, P., Humphreys, L., Eds.; Peter Lang: New York, NY, USA, 2006; pp. 313–330; ISBN 978-0-8204-7840-1.
20. SoftBank Robotics. Available online: https://www.softbankrobotics.com/emea/en (accessed on 5 November 2018).
21. Andreasson, R.; Alenljung, B.; Billing, E.; Lowe, R. Affective touch in human-robot interaction: Conveying emotion to the Nao robot. Int. J. Soc. Robot. 2018, 10, 473–491.
22. Alenljung, B.; Andreasson, R.; Billing, E.A.; Lindblom, J.; Lowe, R. User experience of conveying emotions by touch. In Proceedings of the 26th IEEE International Symposium on Robot and Human Interactive Communication, RO-MAN, Lisbon, Portugal, 28 August–1 September 2017; pp. 1240–1247.
23. Goodrich, M.A.; Schultz, A.C. Human-robot interaction: A survey. Found. Trends Hum.–Comput. Interact. 2007, 1, 203–275.
24. Dautenhahn, K.; Saunders, J. Introduction. In New Frontiers in Human-Robot Interaction; Dautenhahn, K., Saunders, J., Eds.; John Benjamins Publishing Company: Amsterdam, The Netherlands, 2011; ISBN 978-90-272-0455-4.
25. Dautenhahn, K. Human-Robot Interaction. In The Encyclopedia of Human-Computer Interaction, 2nd ed.; Soegaard, M., Friis Dam, R., Eds.; The Interaction Design Foundation: Aarhus, Denmark, 2013.
26. Yanco, H.A.; Drury, J. Classifying human-robot interaction: An updated taxonomy. In Proceedings of the IEEE International Conference on Systems, Man and Cybernetics 2004, The Hague, The Netherlands, 10–13 October 2004; pp. 2841–2846.
27. Thrun, S. Toward a framework for human-robot interaction. Hum.-Comput. Interact. 2004, 19, 9–24.
28. Scholtz, J. Theory and evaluation of human robot interaction. In Proceedings of the 36th Hawaii International Conference on System Sciences (HICSS’03), Big Island, HI, USA, 6–9 January 2003.
29. Fong, T.; Nourbakhsh, I.; Dautenhahn, K. A survey of socially interactive robots. Robot. Auton. Syst. 2003, 42, 143–166.
30. Kaasinen, E.; Roto, V.; Hakulinen, J.; Heimonen, T.; Jokinen, J.P.P.; Karvonen, H.; Keskinen, T.; Koskinen, H.; Lu, Y.; Saariluoma, P.; et al. Defining user experience goals to guide the design of industrial systems. Behav. Inf. Technol. 2015, 34, 976–991.
31. Partala, T.; Kallinen, A. Understanding the most satisfying and unsatisfying user experiences: Emotions, psychological needs, and context. Interact. Comput. 2012, 24, 25–34.
32. Hertenstein, M.J.; Verkamp, J.M.; Kerestes, A.M.; Holmes, R.M. The communicative functions of touch in humans, nonhuman primates and rats. Genet. Soc. Gen. Psychol. Monogr. 2006, 132, 5–94.
33. Knapp, M.L.; Hall, J.A. Nonverbal Communication in Human Interaction, 7th ed.; Cengage Learning: Boston, MA, USA, 2010; ISBN 9780495568698.
34. Jones, S.E.; Yarbrough, E. A naturalistic study of the meanings of touch. Commun. Monogr. 1985, 52, 19–56.
35. Kraus, M.W.; Huang, C.; Keltner, D. Tactile communication, cooperation, and performance: An ethological study of the NBA. Emotion 2010, 10, 745–749.
36. Gallace, A.; Spence, C. The science of interpersonal touch: An overview. Neurosci. Biobehav. Rev. 2010, 34, 246–259.
37. Field, T.M. Touch, 2nd ed.; The MIT Press: Cambridge, MA, USA, 2014; ISBN 978-0-805-81890-1.
38. Lagercrantz, H.; Changeux, J.-P. The emergence of human consciousness: From fetal to neonatal life. Pediatr. Res. 2009, 65, 255–260.
39. Feldman, R.; Eidelman, A.I.; Sirota, L.; Weller, A. Comparison of skin-to-skin (kangaroo) and traditional care: Parenting outcomes and preterm infant development. Pediatrics 2002, 110, 16–26.
40. Beckett, C.; Maughan, B.; Rutter, M.; Castle, J.; Colvert, E.; Groothues, C.; Kreppner, J.; Stevens, S.; O’Connor, T.G.; Sonuga-Barke, E.J.S. Do the effects of early severe deprivation on cognition persist into early adolescence? Findings from the English and Romanian adoptees study. Child Dev. 2006, 77, 696–711.
41. MacLean, K. The impact of institutionalization on child development. Dev. Psychopathol. 2003, 15, 853–884.
42. Coakley, A.B.; Duffy, M.E. The effect of therapeutic touch on postoperative patients. J. Holist. Nurs. 2010, 28, 193–200.
43. Ditzen, B.; Neumann, I.D.; Bodenmann, G.; von Dawans, B.; Turner, R.A.; Ehlert, U.; Heinrich, M. Effects of different kinds of couple interaction on cortisol and heart rate responses to stress in women. Psychoneuroendocrinology 2007, 32, 565–574.
44. Crusco, A.H.; Wetzel, C.G. The Midas touch: The effects of interpersonal touch on restaurant tipping. Pers. Soc. Psychol. Bull. 1984, 10, 512–517.
45. Fisher, J.D.; Rytting, M.; Heslin, R. Hands touching hands: Affective and evaluative effects of interpersonal touch. Sociometry 1976, 39, 416–421.
46. Guéguen, N.; Fischer-Lokou, J. An evaluation of touch on a large request: A field setting. Psychol. Rep. 2002, 90, 267–269.
47. Legg, A.A.; Wilson, J.H. Instructor touch enhanced college students’ evaluations. Soc. Psychol. Educ. 2013, 16, 317–327.
48. Major, B.; Schmidlin, A.M.; Williams, L. Gender patterns in social touch: The impact of setting and age. J. Pers. Soc. Psychol. 1990, 58, 634–643.
49. Henley, N.M. Status and sex: Some touching observations. Bull. Psychon. Soc. 1973, 2, 91–93.
50. DiBiase, R.; Gunnoe, J. Gender and culture differences in touching behavior. J. Soc. Psychol. 2004, 144, 49–62.
51. Hall, J.A.; Veccia, E.M. More ‘touching’ observations: New insights on men, women, and interpersonal touch. J. Pers. Soc. Psychol. 1990, 59, 1155–1162.
52. McDaniel, E.; Andersen, P.A. International patterns of interpersonal tactile communication: A field study. J. Nonverbal Behav. 1998, 22, 59–75.
53. Shibata, T.; Mitsui, T.; Wada, K.; Touda, A.; Kumasaka, T.; Tagami, K.; Tanie, K. Mental commit robot and its application to therapy of children. In Proceedings of the IEEE/ASME International Conference on Advanced Intelligent Mechatronics, Como, Italy, 8–12 July 2001; pp. 1053–1058.
54. Stiehl, W.D.; Lieberman, J.; Breazeal, C.; Basel, L.; Lalla, L.; Wolf, M. Design of a therapeutic robotic companion for relational, affective touch. In Proceedings of the IEEE International Workshop on Robot and Human Interactive Communication, Nashville, TN, USA, 13–15 August 2005; pp. 408–415.
55. Yohanan, S.; MacLean, K.E. The role of affective touch in human-robot interaction: Human intent and expectations in touching the haptic creature. Int. J. Soc. Robot. 2012, 4, 163–180.
56. Knight, H.; Toscano, R.; Stiehl, W.D.; Chang, A.; Wang, Y.; Breazeal, C. Real-time social touch gesture recognition for sensate robots. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems IROS 2009, St. Louis, MO, USA, 10–15 October 2009; pp. 3715–3720.
57. Noda, T.; Miyashita, T.; Ishiguro, H.; Hagita, N. Super-flexible skin sensors embedded on the whole body self-organizing based on haptic interactions. In Human-Robot Interaction in Social Robotics; Kanda, T., Ishiguro, H., Eds.; CRC Press: Boca Raton, FL, USA, 2013; pp. 183–202; ISBN 978-1-4665-0698-5.
58. Cooney, M.D.; Nishio, S.; Ishiguro, H. Importance of touch for conveying affection in a multimodal interaction with a small humanoid robot. Int. J. Humanoid Robot. 2015, 12.
59. Cooney, M.D.; Nishio, S.; Ishiguro, H. Recognizing affection for a touch-based interaction with a humanoid robot. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems IROS, Vilamoura, Portugal, 7–11 October 2012; pp. 1420–1427.
60. Lowe, R.; Andreasson, R.; Alenljung, B.; Lund, A.; Billing, E. Designing for a wearable affective interface for the NAO robot: A study of emotion conveyance by touch. Multimodal Technol. Interact. 2018, 2, 2.
61. Bartneck, C.; Kulić, D.; Croft, E.; Zoghbi, S. Measurement instruments for the anthropomorphism, animacy, likeability, perceived intelligence, and perceived safety of robots. Int. J. Soc. Robot. 2009, 1, 71–81.
62. Nomura, T.; Kanda, T.; Suzuki, T. Experimental investigation into influence of negative attitudes towards robots on human-robot interaction. AI Soc. 2006, 20, 138–150.
63. Thomas, D.R. A general inductive approach for analyzing qualitative evaluation data. Am. J. Eval. 2006, 27, 237–246.
64. Patton, M.Q. Qualitative Research & Evaluation Methods: Integrating Theory and Practice, 4th ed.; SAGE Publications: Thousand Oaks, CA, USA, 2015; ISBN 9781412972123.
65. Strauss, A.; Corbin, J. Basics of Qualitative Research: Techniques and Procedures for Developing Grounded Theory, 2nd ed.; SAGE Publications: Thousand Oaks, CA, USA, 1998; ISBN 0-8039-5939-7.
66. Alenljung, B.; Lindblom, J.; Andreasson, R.; Ziemke, T. User experience in social human-robot interaction. Int. J. Ambient Comput. Intell. 2017, 8, 12–31.
67. Young, J.E.; Sung, J.Y.; Voida, A.; Sharlin, E.; Igarashi, T.; Christensen, H.I.; Grinter, R.E. Evaluating human-robot interaction: Focusing on the holistic interaction experience. Int. J. Soc. Robot. 2011, 3, 53–67.
68. Weiss, A.; Bernhaupt, R.; Tscheligi, M. The USUS evaluation framework for user-centered HRI. In New Frontiers in Human-Robot Interaction; Dautenhahn, K., Saunders, J., Eds.; John Benjamins: Amsterdam, The Netherlands, 2011; pp. 89–110; ISBN 978-90-272-0455-4.
69. Jourard, S.M. An exploratory study of body-accessibility. Br. J. Soc. Clin. Psychol. 1966, 5, 221–231.
70. Tay, B.; Jung, Y.; Park, T. When stereotypes meet robots: The double-edge sword of robot gender and personality in human–robot interaction. Comput. Hum. Behav. 2014, 38, 75–84.
71. Brščić, D.; Kidokoro, H.; Suehiro, Y.; Kanda, T. Escaping from children’s abuse of social robots. In Proceedings of the 10th ACM/IEEE International Conference on Human-Robot Interaction HRI, Portland, OR, USA, 3–5 March 2015; pp. 59–66.
72. Salvini, P.; Ciaravella, G.; Yu, W.; Ferri, G.; Manzi, A.; Mazzolai, B.; Laschi, C.; Oh, S.R.; Dario, P. How safe are service robots in urban environments? Bullying a robot. In Proceedings of the 19th International Symposium in Robot and Human Interactive Communication, Viareggio, Italy, 13–15 September 2010; pp. 1–7.
73. Nomura, T.; Kanda, T.; Kidokoro, H.; Suehiro, Y.; Yamada, S. Why do children abuse robots? Interact. Stud. 2016, 17, 347–369.
74. Lindblom, J. Embodied Social Interaction, Cognitive Systems Monographs (COSMOS); Springer: Berlin, Germany, 2015; Volume 26; ISBN 978-3-319-20315-7.
75. Preece, J.; Rogers, Y.; Sharp, H. Interaction Design: Beyond Human-Computer Interaction, 4th ed.; John Wiley & Sons: Chichester, UK, 2015; ISBN 978-0-470-66576-3.
76. Fukuda, H.; Shiomi, M.; Nakagawa, K.; Ueda, K. ‘Midas touch’ in human-robot interaction: Evidence from event-related potentials during the ultimatum game. In Proceedings of the Seventh Annual ACM/IEEE International Conference on Human-Robot Interaction HRI’12, Boston, MA, USA, 5–8 March 2012; pp. 131–132.
77. Nakagawa, K.; Shiomi, M.; Shinozawa, K.; Matsumura, R.; Ishiguro, H.; Hagita, N. Effect of robot’s active touch on people’s motivation. In Proceedings of the 6th International Conference on Human-Robot Interaction HRI’11, Lausanne, Switzerland, 6–9 March 2011; pp. 465–472.
78. Alesich, S.; Rigby, M. Gendered robots: Implications for our humanoid future. IEEE Technol. Soc. Mag. 2017, 36, 50–59.
79. Geraci, R.M. Robots and the sacred in science and science fiction: Theological implications of artificial intelligence. Zygon 2007, 42, 961–980.
Figure 1. Experimental set-up where the participant interacts with the Nao in the Usability Lab. The participant interacts with the Nao by touching left and right arms to convey a particular emotion. Camera shots are supplied by the ELAN annotation tool: https://tla.mpi.nl/tools/tla-tools/elan/.
Figure 1. Experimental set-up where the participant interacts with the Nao in the Usability Lab. The participant interacts with the Nao by touching left and right arms to convey a particular emotion. Camera shots are supplied by the ELAN annotation tool: https://tla.mpi.nl/tools/tla-tools/elan/.
Mti 02 00082 g001
Figure 2. Aspects contributing to the unsafe experience.
Figure 2. Aspects contributing to the unsafe experience.
Mti 02 00082 g002
Figure 3. Aspects contributing to the odd experience.
Figure 3. Aspects contributing to the odd experience.
Mti 02 00082 g003
Figure 4. Aspects contributing to the difficult experience.
Figure 4. Aspects contributing to the difficult experience.
Mti 02 00082 g004
Figure 5. Aspects contributing to the interesting and fun experience.
Figure 5. Aspects contributing to the interesting and fun experience.
Mti 02 00082 g005
Figure 6. Overview of the UX dimensions.
Figure 6. Overview of the UX dimensions.
Mti 02 00082 g006
Figure 7. Relation between the aspect ‘To show emotions’ and UX dimension.
Figure 7. Relation between the aspect ‘To show emotions’ and UX dimension.
Mti 02 00082 g007
Figure 8. Relation between the aspect ‘To touch others’ and UX dimensions.
Figure 9. Relation between the aspect ‘Predisposal to robots’ and UX in general.
Figure 10. Relation between the aspect ‘Point of reference: human being’ and touch behavior.
Figure 11. Relation between the aspect ‘Relationship’, UX dimensions, and touch behavior.
Figure 12. Relation between the aspect ‘Point of reference: softness’ and touch behavior.
Figure 13. Relation between the aspect ‘Robot gender’ and touch behavior.
Figure 14. Relation between the aspect ‘De facto a robot’ and UX dimensions.
Figure 15. Relation between the aspect ‘Size’, UX dimensions, and touch behavior.
Figure 16. Relation between the aspect ‘Surface’, UX dimensions, and touch behavior.
Figure 17. Relation between the aspect ‘Fragility’, UX dimensions, and touch behavior.
Figure 18. Proportion of participants answering “difficult” or “very difficult” and “uncertain” or “very uncertain” over different emotions [22].
Figure 19. Relation between the aspect ‘Type of emotion’, UX dimensions, and touch behavior.
Figure 20. Relation between the aspect ‘Response’, UX dimensions, and touch behavior.
Figure 21. Relation between the aspect ‘Modality’, UX dimensions, and touch behavior.
Figure 22. Relation between the aspect ‘Encounters’ and UX dimensions.
Figure 23. Overview of relations between aspects, user experiences, and touch behaviors.
Table 1. Overview of aspects affecting the user experience and touch behavior.

| Group of Aspects | Aspects Affecting User Experience and Touch Behavior | Dimensions Along Which the Aspects Vary |
|---|---|---|
| Individual characteristics | To show emotions | Non-emotion showing ↔ High degree of emotion showing |
| | To touch others | Do not touch ↔ High degree of touching |
| | Predisposal to robots | Positive ↔ Negative |
| Ways of thinking | Point of reference: human being | Not human ↔ Human |
| | | Low attributed human age ↔ High attributed human age |
| | Relationship | Low level of intimacy ↔ High level of intimacy |
| | | Unclear ↔ Clear |
| | Point of reference: softness | Do not imagine softness ↔ Imagine softness |
| | Robot gender | Attribution of female gender ↔ Attribution of male gender |
| Robot characteristics | De facto a robot | |
| | Size | Small ↔ Large |
| | Surface | Hard ↔ Soft |
| | Fragility | Appear as highly fragile ↔ Appear as not fragile |
| Task and interaction characteristics | Type of emotion | Negative emotions ↔ Positive emotions |
| | | Robot gender dependent ↔ Robot gender independent |
| | Response | Low degree of responsiveness ↔ High degree of responsiveness |
| | Modality | Single modal ↔ Multi modal |
| | Encounters | First time ↔ Many previous |
