An Emotional Design Model for Future Smart Product Based on Grounded Theory
Abstract
1. Introduction
2. Literature Review
2.1. Examples of Emotional Design for Smart Products
2.2. Various Perspectives on the Emotional Design for Smart Products
2.3. Models Related to Smart Product Design
2.4. Selection and Comparison of Research Methods
3. Research Methodology
3.1. Study Design
3.2. Research Participants
3.3. Data Collection
3.3.1. Product Introduction and Users’ Comments
3.3.2. Interviews
3.4. Data Analysis
3.4.1. Substantive Coding
3.4.2. Theoretical Coding
3.4.3. Theoretical Saturation Test
4. Results
4.1. Raw Data and Database
4.2. Coding Result
4.3. Category Definition and Model Presentation
4.4. Theoretical Saturation Test
4.5. Selective Coding Interpretation
- User’s Emotional Needs: This category involves user and scenario analysis, for example, the selection of target users and target scenarios, as well as preferences derived from user research, such as emotional expectations. It also covers special needs arising from socio-cultural contexts. Based on this category, the real emotional needs of users are explained [72] and the core issues for emotional design are defined [47].
- Concepts for Emotional Design: The design team identifies the core design concept by combining other information, such as cultural concepts and business strategies [73]. For example, different design teams may propose concepts such as “emotion-relevant personalization” and “minimalism” for smart home products. It is important to define these design concepts in conjunction with social values, design ethics [74], business strategies, and design visions [75], which can guide the design process in a variety of ways.
- Role Construction: When brainstorming ideas for the emotional design of a smart product [76], designers often treat the product as a living thing, which makes role construction important. Roles can be viewed in terms of occupational roles, interpersonal roles, and product roles, such as “doctor”, “friend”, or “toy”. According to role construction theory in social psychology, people subconsciously presuppose each other’s appearance and personality when they interact, and role constructs can also be used to predict the behavior of others. Defining an appropriate role for a product therefore helps a design team generate more specific design ideas.
- Character and Behavior: When design teams treat smart products as living beings for creative design, the products’ character and behavior also need to be defined. Designers often create personality types for smart products by imagining “behavioral habits”, “social attributes”, “anthropomorphic traits”, etc. [7,77,78]. In conjunction with the actual functionality and technical capabilities, designers can express the personality and behavior of a product in various ways.
- Emotionally Relevant Functions: Design teams often start from a specific functionality perspective rather than from experiential aspects. The functionalities of a smart product are determined by two aspects: the existing features of the smart product itself and the new features derived from analysis of users’ emotional needs. For example, for smart speakers, the information quiz is an existing feature, whereas the child-mode dialogue is a new feature designed specifically for children.
- Design Elements: When faced with a specific product function, design teams should consider the selection of appropriate design elements [9,79]. “Software”, “hardware”, “appearance”, “materials”, etc., are all design elements necessary to achieve product function. Additionally, it is essential to consider the “shape”, “color”, “digital content”, “size”, etc. of these design elements.
- Technical Capabilities: Technical capabilities are as significant as design elements, since the emotionality of smart products is enhanced by artificial intelligence technologies [80] such as “context awareness”, “emotion recognition”, and “motion control” [81]. To ensure that the emotion of the smart product can be successfully expressed by the product and experienced by the user, the design team should consider the technical capabilities required to implement each function (a minimal illustrative sketch of how such capabilities can drive emotional behavior follows this list).
- Use and Configuration: After defining design elements and technical capabilities, the design team must decide how users interact with the product [82,83]. The interaction between the user and the smart product can be specified through concepts such as “interaction methods”, “operation methods”, “hardware and software configurations”, “multi-end collaboration”, and “supporting services”, which together shape the user’s experience with the product [11].
- Product Evaluation: Product evaluations, such as “ease of use”, “proactivity”, and “inclusiveness” [85], can be obtained after a product or prototype has been released and tested by real users [84]. Based on these objective evaluations, the design team can reflect on and further improve the smart product after it has been developed.
- User Emotional Experience: In addition to objective evaluations, users also have subjective emotional experiences with smart products [78,86]. For example, users may develop a certain level of attachment and closeness to the product or interpret it anthropomorphically. These emotional experiences can serve as a useful reference for improving product design in terms of emotion-relevant personalization and emotion-relevant customization.
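To make these categories more concrete, the following minimal Python sketch (purely illustrative and not taken from the study) shows one way a design team might operationalize “Role Construction”, “Character and Behavior”, and “Technical Capabilities” as a rule-based mapping from a recognized user emotion to a product behavior. All role names, traits, emotion labels, and behaviors are hypothetical placeholders.

```python
# Illustrative sketch only: a rule-based mapping from a recognized user emotion
# to a product behavior that is consistent with a chosen role and character.
# Role names, traits, emotion labels, and behaviors are hypothetical placeholders.
from dataclasses import dataclass, field


@dataclass
class ProductCharacter:
    """Character and Behavior: a personality profile chosen by the design team."""
    role: str                       # Role Construction, e.g., "friend", "doctor", "toy"
    traits: list = field(default_factory=list)


def recognize_emotion(sensor_input: str) -> str:
    """Technical Capabilities: stand-in for an emotion-recognition module.

    In a real product this would be driven by cameras, microphones, or touch sensors.
    """
    keywords = {"sigh": "sad", "laugh": "happy", "silence": "neutral"}
    return keywords.get(sensor_input, "neutral")


def choose_behavior(character: ProductCharacter, user_emotion: str) -> str:
    """Map the recognized emotion to a behavior that fits the product's role."""
    if user_emotion == "sad" and character.role == "friend":
        return "play a comforting sound and soften the ambient light"
    if user_emotion == "happy":
        return "respond with an upbeat voice line"
    return "stay quiet and wait for the next interaction"


if __name__ == "__main__":
    companion = ProductCharacter(role="friend", traits=["warm", "humorous"])
    emotion = recognize_emotion("sigh")
    print(choose_behavior(companion, emotion))
```

In an actual product, recognize_emotion would be backed by the sensing and recognition capabilities listed under “Technical Capabilities”, and the behavior rules would be authored by the design team in line with the chosen role, character, and emotionally relevant functions.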
4.6. Explanation of Theoretical Sampling
4.6.1. “User’s Emotional Needs” to “Concepts for Emotional Design”
“If the product looks like a real animal, it would be a bit creepy, and I’d like the appearance to be more abstract than realistic”.
4.6.2. Emotional Attributes of Smart Products
“For many years, it’s been your loyal friend and is constantly updated with interesting content you need”.
4.6.3. The Practice of Smart Products’ Emotional Design
“The haptic sensor gives Ollie the ability to ‘sense contact’ and respond to touch”.
4.6.4. Experience and Evaluation
“QTrobot acts in a consistent and predictable way to prevent children from feeling overwhelmed”.
5. Discussion
5.1. Comparison with Existing Models
5.2. How to Apply This Model in Smart Product Design
6. Conclusions
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
References
- Ma, X. Towards Human-Engaged AI. In Proceedings of the 27th International Joint Conference on Artificial Intelligence, Stockholm, Sweden, 13–19 July 2018; pp. 5682–5686. [Google Scholar]
- Chapman, J. Emotionally Durable Design; Routledge: Oxfordshire, UK, 2012; ISBN 978-1-136-56743-8. [Google Scholar]
- Brave, S.; Nass, C.; Hutchinson, K. Computers That Care: Investigating the Effects of Orientation of Emotion Exhibited by an Embodied Computer Agent. Int. J. Hum.-Comput. Stud. 2005, 62, 161–178. [Google Scholar] [CrossRef]
- Paschkewitz, J.; Patt, D. Can AI Make Your Job More Interesting? Issues Sci. Technol. 2020, 37, 74–78. [Google Scholar]
- Dalvandi, B. A Model of Empathy for Artificial Agent Teamwork. Ph.D. Thesis, University of Northern British Columbia, Prince George, BC, Canada, 2013. [Google Scholar]
- Luca, J.; Tarricone, P. Does Emotional Intelligence Affect Successful Teamwork? In Proceedings of the 18th Annual Conference of the Australasian Society for Computers in Learning in Tertiary Education, Melbourne, Australia, 9 December 2001; pp. 367–376. [Google Scholar]
- Afzal, S.; Dempsey, B.; D’Helon, C.; Mukhi, N.; Pribic, M.; Sickler, A.; Strong, P.; Vanchiswar, M.; Wilde, L. The Personality of AI Systems in Education: Experiences with the Watson Tutor, a One-on-One Virtual Tutoring System. Child. Educ. 2019, 95, 44–52. [Google Scholar] [CrossRef]
- Norman, D.A. Emotional Design: Why We Love (or Hate) Everyday Things; Basic Books: New York, NY, USA, 2004; ISBN 978-0-465-00417-1. [Google Scholar]
- Zabala, U.; Rodriguez, I.; Martínez-Otzeta, J.M.; Lazkano, E. Expressing Robot Personality through Talking Body Language. Appl. Sci. 2021, 11, 4639. [Google Scholar] [CrossRef]
- Nijdam, N.A. Mapping Emotion to Color. In Book Mapping Emotion to Color; University of Twente: Enschede, The Netherlands, 2009; pp. 2–9. [Google Scholar]
- Ma, J.; Feng, X.; Gong, Z.; Zhang, Q. The Design Definition and Research of In-Car Digital AI Assistant. J. Phys. Conf. Ser. 2021, 1802, 032096. [Google Scholar] [CrossRef]
- Liu, L.; Zhang, A.; Zhang, L.; Xu, J. Research on Emotional Design of Intelligent Sleep Products for the Elderly Based on Kano Model and Customer Satisfaction and Dissatisfaction Coefficients. In Proceedings of the 2021 2nd International Conference on Intelligent Design (ICID), Xi’an, China, 19 October 2021; pp. 528–531. [Google Scholar]
- Johnson, D.O.; Cuijpers, R.H.; van der Pol, D. Imitating Human Emotions with Artificial Facial Expressions. Int. J. Soc. Robot. 2013, 5, 503–513. [Google Scholar] [CrossRef]
- Desmet, P.; Hekkert, P. Framework of Product Experience. Int. J. Des. 2007, 1, 57–66. [Google Scholar]
- Norman, D.A.; Ortony, A. Designers and users: Two perspectives on emotion and design. In Proceedings of the Symposium on Foundations of Interaction Design, Ivrea, Italy, 12−13 November 2003; pp. 1–13. [Google Scholar]
- Lin, R.T. Transforming Taiwan aboriginal cultural features into modern product design: A case study of a cross-cultural product design model. Int. J. Des. 2007, 1, 45–53. [Google Scholar]
- Scherer, K.R. What are emotions? And how can they be measured? Soc. Sci. Inf. 2005, 44, 695–729. [Google Scholar] [CrossRef]
- Sandygulova, A.; O’Hare, G.M.P. Children’s Perception of Synthesized Voice: Robot’s Gender, Age and Accent. In Social Robotics; Tapus, A., André, E., Martin, J.-C., Ferland, F., Ammi, M., Eds.; Lecture Notes in Computer Science; Springer International Publishing: Cham, Switzerland, 2015; Volume 9388, pp. 594–602. ISBN 978-3-319-25553-8. [Google Scholar]
- Song, S.; Baba, J.; Nakanishi, J.; Yoshikawa, Y.; Ishiguro, H. Mind the Voice!: Effect of Robot Voice Pitch, Robot Voice Gender, and User Gender on User Perception of Teleoperated Robots. In Proceedings of the Extended Abstracts of the 2020 CHI Conference on Human Factors in Computing Systems, Honolulu, HI, USA, 25 April 2020; pp. 1–8. [Google Scholar]
- Lottridge, D.; Chignell, M.; Jovicic, A. Affective Interaction: Understanding, Evaluating, and Designing for Human Emotion. Rev. Hum. Factors Ergon. 2011, 7, 197–217. [Google Scholar] [CrossRef]
- Asada, M. Towards Artificial Empathy. Int. J. Soc. Robot. 2015, 7, 19–33. [Google Scholar] [CrossRef] [Green Version]
- Li, X.; Cai, S. Emotional Design for Intelligent Products Using Artificial Intelligence Technology. In Proceedings of the 2021 2nd International Conference on Intelligent Design (ICID), Xi’an, China, 19 October 2021; pp. 260–263. [Google Scholar]
- Zuo, X.; Yu, X.; Du, M.; Song, Q. Generating Consistent Multimodal Dialogue Responses with Emoji Context Model. In Proceedings of the 2022 5th International Conference on Artificial Intelligence and Big Data (ICAIBD), Chengdu, China, 27 May 2022; pp. 617–624. [Google Scholar]
- Beattie, A.; Edwards, A.P.; Edwards, C. A Bot and a Smile: Interpersonal Impressions of Chatbots and Humans Using Emoji in Computer-Mediated Communication. Commun. Stud. 2020, 71, 409–427. [Google Scholar] [CrossRef]
- Fadhil, A.; Schiavo, G.; Wang, Y.; Yilma, B.A. The Effect of Emojis When Interacting with Conversational Interface Assisted Health Coaching System. In Proceedings of the 12th EAI International Conference on Pervasive Computing Technologies for Healthcare, New York, NY, USA, 21 May 2018; pp. 378–383. [Google Scholar]
- Rogers, K.; Bryant, D.; Howard, A. Robot Gendering: Influences on Trust, Occupational Competency, and Preference of Robot over Human. In Proceedings of the Extended Abstracts of the 2020 CHI Conference on Human Factors in Computing Systems, Honolulu, HI, USA, 25 April 2020; pp. 1–7. [Google Scholar]
- Ye, H.; Jeong, H.; Zhong, W.; Bhatt, S.; Izzetoglu, K.; Ayaz, H.; Suri, R. The Effect of Anthropomorphization and Gender of a Robot on Human-Robot Interactions. In Advances in Neuroergonomics and Cognitive Engineering; Ayaz, H., Ed.; Advances in Intelligent Systems and Computing; Springer International Publishing: Cham, Switzerland, 2020; Volume 953, pp. 357–362. ISBN 978-3-030-20472-3. [Google Scholar]
- Yu, C.; Fu, C.; Chen, R.; Tapus, A. First Attempt of Gender-Free Speech Style Transfer for Genderless Robot. In Proceedings of the 2022 17th ACM/IEEE International Conference on Human-Robot Interaction (HRI), Sapporo, Japan, 7 March 2022; pp. 1110–1113. [Google Scholar]
- Eyssel, F.; Kuchenbrandt, D.; Bobinger, S. ‘If You Sound Like Me, You Must Be More Human’: On the Interplay of Robot and User Features on Human—Robot Acceptance and Anthropomorphism. In Proceedings of the Seventh Annual ACM/IEEE International Conference on Human-Robot Interaction (HRI ’12), New York, NY, USA, 5–8 March 2012; pp. 125–126. [Google Scholar]
- Ariyoshi, T.; Nakadai, K.; Tsujino, H. Effect of Facial Colors on Humanoids in Emotion Recognition Using Speech. In Proceedings of the RO-MAN 2004, 13th IEEE International Workshop on Robot and Human Interactive Communication (IEEE Catalog No.04TH8759), Kurashiki, Japan, 2004; pp. 59–64. [Google Scholar]
- Kim, M.G.; Lee, H.S.; Park, J.W.; Jo, S.H.; Chung, M.J. Determining Color and Blinking to Support Facial Expression of a Robot for Conveying Emotional Intensity. In Proceedings of the RO-MAN 2008—The 17th IEEE International Symposium on Robot and Human Interactive Communication, Munich, Germany, 1–3 August 2008; pp. 219–224. [Google Scholar]
- Duffy, B.R. Anthropomorphism and the Social Robot. Robot. Auton. Syst. 2003, 42, 177–190. [Google Scholar] [CrossRef]
- de Visser, E.J.; Monfort, S.S.; McKendrick, R.; Smith, M.A.B.; McKnight, P.E.; Krueger, F.; Parasuraman, R. Almost Human: Anthropomorphism Increases Trust Resilience in Cognitive Agents. J. Exp. Psychol. Appl. 2016, 22, 331–349. [Google Scholar] [CrossRef] [PubMed]
- Baraka, K.; Rosenthal, S.; Veloso, M. Enhancing Human Understanding of a Mobile Robot’s State and Actions Using Expressive Lights. In Proceedings of the 2016 25th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN), New York, NY, USA, 26–31 August 2016; pp. 652–657. [Google Scholar]
- Moflin. Available online: https://www.moflin.com (accessed on 5 January 2023).
- Emo Robot. Available online: https://comingsoon.higizmos.com/emo (accessed on 5 January 2023).
- MarsCat: A Bionic Cat, a Home Robot|Elephant Robotics. Available online: https://www.elephantrobotics.com/en/mars-en/ (accessed on 5 January 2023).
- Sony Aibo. Available online: https://us.aibo.com/ (accessed on 5 January 2023).
- Marsella, S.; Gratch, J. Modeling Coping Behavior in Virtual Humans: Don’t Worry, Be Happy. In Proceedings of the Second International Joint Conference on Autonomous Agents and Multiagent Systems—AAMAS ’03, Melbourne, Australia, 14–18 July 2003; p. 313. [Google Scholar]
- Picard, R.W. Affective Computing; MIT Press: Cambridge, MA, USA; London, UK, 1997; ISBN 0-262-16170-2. [Google Scholar]
- Fogg, B.J. Persuasive Technology: Using Computers to Change What We Think and Do. Ubiquity 2002, 2002, 5. [Google Scholar] [CrossRef] [Green Version]
- Design Council a Study of the Design Process. Available online: https://www.designcouncil.org.uk/our-work/skills-learning/resources/11-lessons-managing-design-global-brands/ (accessed on 31 December 2022).
- Stanford University Design Thinking Bootleg. Available online: https://dschool.stanford.edu/resources/design-thinking-bootleg (accessed on 31 December 2022).
- IDEO Tools. Available online: https://www.ideo.org/tools (accessed on 5 January 2023).
- Francalanza, E.; Borg, J.; Fenech, A.; Farrugia, P. Emotional Product Design: Merging Industrial and Engineering Design Perspectives. Procedia CIRP 2019, 84, 124–129. [Google Scholar] [CrossRef]
- Zhao, T.; Zhu, T. Exploration of Product Design Emotion Based on Three-Level Theory of Emotional Design. In Human Interaction and Emerging Technologies; Ahram, T., Taiar, R., Colson, S., Choplin, A., Eds.; Advances in Intelligent Systems and Computing; Springer International Publishing: Cham, Switzerland, 2020; Volume 1018, pp. 169–175. ISBN 978-3-030-25628-9. [Google Scholar]
- Boisseau, E.; Bouchard, C.; Omhover, J. Towards a Model of the Open-Design Process: Using the Grounded Theory for Modelling Implicit Design Processes; Maier, A., Skec, S., Kim, H., Kokkolaras, M., Oehmen, J., Fadel, G., Salustri, F., VanDerLoos, M., Eds.; The Design Society: Vancouver, BC, Canada, 2017; pp. 121–130. [Google Scholar]
- Wang, Z.; He, W.P.; Zhang, D.H.; Cai, H.M.; Yu, S.H. Creative Design Research of Product Appearance Based on Human–Machine Interaction and Interface. J. Mater. Process. Technol. 2002, 129, 545–550. [Google Scholar] [CrossRef]
- Yi, Y.; Li, Y. Study on Grounded Theory-Based Product Design. In Proceedings of the 2021 3rd International Conference on Artificial Intelligence and Advanced Manufacture, Manchester, UK, 23 October 2021; pp. 42–46. [Google Scholar]
- Lu, R.; Feng, Y.; Zheng, H.; Tan, J. A Product Design Based on Interaction Design and Axiomatic Design Theory. Procedia CIRP 2016, 53, 125–129. [Google Scholar] [CrossRef] [Green Version]
- Niu, X.; Wang, M.; Qin, S. Product Design Lifecycle Information Model (PDLIM). Int. J. Adv. Manuf. Technol. 2022, 118, 2311–2337. [Google Scholar] [CrossRef]
- Tao, F.; Cheng, J.; Qi, Q.; Zhang, M.; Zhang, H.; Sui, F. Digital Twin-Driven Product Design, Manufacturing and Service with Big Data. Int. J. Adv. Manuf. Technol. 2018, 94, 3563–3576. [Google Scholar] [CrossRef]
- Kiritsis, D. Closed-Loop PLM for Intelligent Products in the Era of the Internet of Things. Comput.-Aided Des. 2011, 43, 479–501. [Google Scholar] [CrossRef]
- Landowska, A.; Szwoch, M.; Szwoch, W. Methodology of Affective Intervention Design for Intelligent Systems. Interact. Comput. 2016, 28, 737–759. [Google Scholar] [CrossRef]
- Hong, L.; Luo, L. Kansei Engineering Design; Tsinghua University Press: Beijing, China, 2015; ISBN 978-7-302-41213-7. [Google Scholar]
- Zimmerman, J.; Forlizzi, J.; Evenson, S. Research through Design as a Method for Interaction Design Research in HCI. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, San Jose, CA, USA, 29 April 2007; pp. 493–502. [Google Scholar]
- Ciesielska, M.; Boström, K.W.; Öhlander, M. Observation Methods. In Qualitative Methodologies in Organization Studies; Ciesielska, M., Jemielniak, D., Eds.; Springer International Publishing: Cham, Switzerland, 2018; pp. 33–52. ISBN 978-3-319-65441-6. [Google Scholar]
- Wojnar, D.M.; Swanson, K.M. Phenomenology: An Exploration. J. Holist. Nurs. 2007, 25, 172–180. [Google Scholar] [CrossRef] [Green Version]
- Glaser, B.; Strauss, A. Discovery of Grounded Theory: Strategies for Qualitative Research; Routledge: New York, NY, USA, 2017; ISBN 978-0-203-79320-6. [Google Scholar]
- Smith, R.P.; Morrow, J.A. Product Development Process Modeling. Des. Stud. 1999, 20, 237–261. [Google Scholar] [CrossRef]
- Paillé, P. L’analyse Par Théorisation Ancrée. CRS 1994, 23, 147–181. [Google Scholar] [CrossRef] [Green Version]
- Corbin, J.; Strauss, A. Grounded Theory Research: Procedures, Canons, and Evaluative Criteria. Qual. Sociol. 1990, 13, 3–21. [Google Scholar] [CrossRef]
- Rihacek, T.; Danelova, E. The Journey of an Integrationist: A Grounded Theory Analysis. Psychotherapy 2016, 53, 78–89. [Google Scholar] [CrossRef]
- Patton, M.Q. Qualitative Research & Evaluation Methods; SAGE: Thousand Oaks, CA, USA, 2002; ISBN 978-0-7619-1971-1. [Google Scholar]
- Chen, X. Qualitative Research in Social Sciences; Educational Science Publication House: Beijing, China, 2000; ISBN 978-7-5041-1926-1. [Google Scholar]
- Farquhar, J.; Michels, N.; Robson, J. Triangulation in Industrial Qualitative Case Study Research: Widening the Scope. Ind. Mark. Manag. 2020, 87, 160–170. [Google Scholar] [CrossRef]
- Gubrium, J.F.; Holstein, J.A. Handbook of Interview Research: Context and Method; SAGE Publications: Thousand Oaks, CA, USA, 2001; ISBN 978-1-4833-6589-3. [Google Scholar]
- Hill, C.E.; Thompson, B.J.; Williams, E.N. A Guide to Conducting Consensual Qualitative Research. Couns. Psychol. 1997, 25, 517–572. [Google Scholar] [CrossRef] [Green Version]
- Brown, T. Design Thinking. Harv. Bus. Rev. 2008, 86, 84. [Google Scholar]
- Coughlan, P.; Suri, J.F.; Canales, K. Prototypes as (Design) Tools for Behavioral and Organizational Change: A Design-Based Approach to Help Organizations Change Work Behaviors. J. Appl. Behav. Sci. 2007, 43, 122–134. [Google Scholar] [CrossRef] [Green Version]
- Charmaz, K. A Constructivist Grounded Theory Analysis of Losing and Regaining a Valued Self. In Five Ways of Doing Qualitative Analysis: Phenomenological Psychology, Grounded Theory, Discourse Analysis, Narrative Research, and Intuitive Inquiry; Guilford Press: New York, NY, USA, 2011; pp. 165–204. ISBN 978-1-60918-142-0. [Google Scholar]
- Black, A. Empathic Design: User Focused Strategies for Innovation. Proc. New Prod. Dev. 1998. [Google Scholar]
- Wormald, P. Value Proposition for Designers—VP(d): A Tool for Strategic Innovation in New Product Development. Int. J. Bus. Environ. 2015, 7, 262. [Google Scholar] [CrossRef]
- Damm, L. Moral Machines: Teaching Robots Right from Wrong. Philos. Psychol. 2012, 25, 149–153. [Google Scholar] [CrossRef]
- d’Anjou, P. Toward an Horizon in Design Ethics. Sci. Eng. Ethics 2010, 16, 355–370. [Google Scholar] [CrossRef]
- Zong, Y.; GuangXin, W. Anthropomorphism: The Psychological Application in the Interaction between Human and Computer. Psychol. Tech. Appl. 2016, 4, 296–305. [Google Scholar] [CrossRef]
- LuxAI QT Robot. Available online: https://luxai.com/ (accessed on 31 October 2022).
- Zhou, M.X.; Mark, G.; Li, J.; Yang, H. Trusting Virtual Agents: The Effect of Personality. ACM Trans. Interact. Intell. Syst. 2019, 9, 10. [Google Scholar] [CrossRef]
- Feldmaier, J.; Marmat, T.; Kuhn, J.; Diepold, K. Evaluation of a RGB-LED-Based Emotion Display for Affective Agents. arXiv 2016, arXiv:1612.07303. [Google Scholar] [CrossRef]
- Hegel, F.; Spexard, T.; Wrede, B.; Horstmann, G.; Vogt, T. Playing a Different Imitation Game: Interaction with an Empathic Android Robot. In Proceedings of the 2006 6th IEEE-RAS International Conference on Humanoid Robots, Genova, Italy, 4–6 December 2006; pp. 56–61. [Google Scholar]
- Emi, T.; Hagiwara, M. Pose Generation System Expressing Feelings and State. Int. J. Affect. Eng. 2014, 13, 175–184. [Google Scholar] [CrossRef] [Green Version]
- Dragan, A.D.; Lee, K.C.T.; Srinivasa, S.S. Legibility and Predictability of Robot Motion. In Proceedings of the 2013 8th ACM/IEEE International Conference on Human-Robot Interaction (HRI), Tokyo, Japan, 3–6 March 2013; pp. 301–308. [Google Scholar]
- Rowland, C.; Goodman, E.; Charlier, M.; Light, A.; Lui, A. Designing Connected Products: UX for the Consumer Internet of Things; O’Reilly Media, Inc.: Sebastopol, CA, USA, 2015; ISBN 1-4493-7272-4. [Google Scholar]
- Hollingsed, T.; Novick, D.G. Usability Inspection Methods after 15 Years of Research and Practice. In Proceedings of the 25th Annual ACM International Conference on Design of Communication—SIGDOC ’07, El Paso, TX, USA, 22–24 October 2007; p. 249. [Google Scholar]
- Motti, V.G.; Caine, K. Human Factors Considerations in the Design of Wearable Devices. Proc. Hum. Factors Ergon. Soc. Annu. Meet. 2014, 58, 1820–1824. [Google Scholar] [CrossRef]
- Dang, J.; Liu, L. Robots Are Friends as Well as Foes: Ambivalent Attitudes toward Mindful and Mindless AI Robots in the United States and China. Comput. Hum. Behav. 2021, 115, 106612. [Google Scholar] [CrossRef]
- Dandavate, U.; Sanders, E.B.-N.; Stuart, S. Emotions Matter: User Empathy in the Product Development Process. Proc. Hum. Factors Ergon. Soc. Annu. Meet. 1996, 40, 415–418. [Google Scholar] [CrossRef]
- van Allen, P. Prototyping Ways of Prototyping AI. Interactions 2018, 25, 46–51. [Google Scholar] [CrossRef]
- Franzoi, S. Social Psychology, 5th ed.; McGraw-Hill Humanities/Social Sciences/Languages: Boston, MA, USA, 2008; ISBN 978-0-07-337059-0. [Google Scholar]
- Yang, Q.; Steinfeld, A.; Rosé, C.; Zimmerman, J. Re-Examining Whether, Why, and How Human-AI Interaction Is Uniquely Difficult to Design. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, Honolulu, HI, USA, 21 April 2020; pp. 1–13. [Google Scholar]
Selective Coding | Axial Coding (Open Coding Counts) | Sub-Total of Open Codes
---|---|---
User’s Emotional Needs | Demand Expectations (4), Emotional Expectations (4), User Preference (4), Demand Adaptability (7), Scenario Adaptability (7), Group Category (4), Social Concern (5), Demand Category (6) | 41 |
Concepts for Emotional Design | The Interaction Concept (9), Design Vision (8), Design Concept (10), Design Ethics (6), Social Value (6), Business Strategy (6), Product Value (4), Pricing (2), Goal (9) | 60 |
Role Construction | Occupational Role (9), Interpersonal Role (9), Product Identity (5), Smart Assistant Type (7) | 30 |
Character and Behavior | Personality (8), Social Attributes (10), Emotional Characteristics (9), Anthropomorphic Behavior (14), Anthropomorphic Design Features (11), Anthropomorphic Design Attributes (8), Anthropomorphic Interaction Design (11) | 71 |
Emotionally Relevant Functions | Home Function (5), Companion Function (5), Life Function (3), Leisure Function (4), Entertainment Function (4), Office Function (4), Information Function (5), Education Function (8), Communication Function (2), Transportation Function (5), Security Function (4), Health Function (6), Content Generation Function (3), Function Ecology (7), Technology Application (12) | 77 |
Design Elements | Hardware (6), Software (5), Digital Content (5), Appearance Features (9), Appearance Size (1), Appearance Shape (5), Appearance Style (4), Material (7), Color Style (5), Design Direction (8) | 55 |
Technical Capabilities | Iterative Ability (3), Recognition Ability (7), Perception Ability (5), Understanding Ability (7), Motor Ability (5), Intelligence Ability (9), Data Ability (5), Product Technology (7), Interaction Technology (7) | 55 |
Use and Configuration | Software Attribute (2), Hardware Configuration (4), Hardware Attribute (9), Multi-Terminal Collaboration (9), Operation Method (5), Operation Setting (6), Interaction Method (10), Supporting Service (6) | 51 |
Product Evaluations | Inclusion (5), Limitation (3), Diversity (10), Ease of Use (9), Proactivity (10), Immediacy (2), Product Features (11), Emotion-Relevant Personalization (4) | 54 |
User Emotional Experience | User Experience (15), Product Impression (Subjective) (12), Product Texture Evaluation (4), Value Evaluation (4), Anthropomorphic Experience (5), User Attitude (4), Emotional Feeling After Use (9) | 53 |
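As a small worked example (an illustrative check, not part of the paper’s analysis pipeline), the sub-totals in the table above can be reproduced by summing the open-code counts given in parentheses for each axial code; two rows are copied below as plain strings.

```python
# Illustrative check: the "Sub-Total of Open Codes" column equals the sum of the
# counts in parentheses. Row strings are copied from the table above.
import re

rows = {
    "User's Emotional Needs": ("Demand Expectations (4), Emotional Expectations (4), "
                               "User Preference (4), Demand Adaptability (7), "
                               "Scenario Adaptability (7), Group Category (4), "
                               "Social Concern (5), Demand Category (6)", 41),
    "Role Construction": ("Occupational Role (9), Interpersonal Role (9), "
                          "Product Identity (5), Smart Assistant Type (7)", 30),
}

for category, (codes, reported_subtotal) in rows.items():
    counts = [int(n) for n in re.findall(r"\((\d+)\)", codes)]
    assert sum(counts) == reported_subtotal, category
    print(f"{category}: {sum(counts)} open-code references")
```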
Selective Coding | Explanation | Original Data Reviews (Axial Codes)
---|---|---
User’s Emotional Needs | The emotional requirements or desires that a user has in relation to a product or service. | I usually just stand in front of him and stop him, just to embarrass him, and wait to see his follow-up reaction after stopping it. (Emotional Expectations) The traditional Chinese pentatonic scale is used to design the sound effect system, making Duer more suitable for Chinese families. (Social Concerns)
Concepts for Emotional Design | The principles or ideas that form the basis of a smart product’s emotional design. | Through the minimalist animal-like design and colorful color matching, it awakens people’s imagination of freedom. (Design Concept) My grandfather is in a nursing home. I hope Tombot brings him solace, because he looks so much like the only Pomeranian he remembers: Stewie. (Social Value)
Role Construction | The role of a smart product in human society. | An artificial intelligence singer based on deep learning algorithms that automatically generates music content. At present, there are many supernova singers such as Xiaobing, He Chang, Chen Shuiruo, Chen Ziyu, etc. (Occupational Role) MarsCat is the world’s first bionic pet cat developed by Elephant Robotics, a robot pet that brings you comfort and surprise. (Interpersonal Role)
Character and Behavior | Personality traits and behaviors of a smart product. | The humorous personality that Siri was once proud of has also been overtaken by Google. (Personality) In a DIY teddy bear shop, there is a manual step of putting a heart in and warming the heart before it goes in. (Anthropomorphic Design Features)
Emotionally Relevant Functions | The specific functions of a smart product that are related to the emotional experience of the user. | It can also notify family members when an elderly user encounters an emergency and assist the elderly in video calls. (Health Function) Neons can help with goal-oriented tasks and also personalize tasks that require a human touch. (Entertainment Function) |
Design Elements | The physical or digital elements that can constitute the appearance, functionality, and behavior of smart products. | Romibo is also equipped with many sensors, including light sensors and acceleration sensors, which can control its trajectory so that the robot can automatically avoid obstacles in front of it. (Hardware) Among the round or square speaker shapes, Libratone’s smart Bluetooth speaker has successfully attracted attention with its cute bird appearance. (Appearance Features) |
Technical Capabilities | The specific skills, knowledge, or resources that are necessary to perform a task or function. | DuerOS has human language capabilities. It can understand human intentions and communicate with people in natural language. (Understanding Ability) Moxi follows orders and rules when system data reveals certain changes in patients. (Data Ability) |
Use and Configuration | How to use the smart product, and the process of preparing a product for use. | When you hold your iPhone close to HomePod mini, you can take immediate control without unlocking your iPhone. (Multi-terminal Collaboration) In terms of emotional interaction, the robot has set 41 types of expressions with dynamic design according to 24 emotions. (Interactive Method) |
Product Evaluations | Evaluations of smart products derived from user experiments. | This reflects Apple’s longstanding commitment to diversity and inclusion: products and services designed to better reflect the diversity of the world we live in. (Diversity, Inclusion) Proactively recommend the functions and information currently needed by users; proactively communicate with users to enhance understanding. (Proactivity)
User Emotional Experience | The emotional experience of a user while interacting with a smart product or service. | Very cool, but I wish they had protections against malicious remote takeover. A robot might pick up a pistol and kill you in your sleep. (User Attitude) The sound of the broadcast is not blunt or coquettish, and it is comfortable to listen to. (Use Experience) |