The Varieties of Agency in Human–Smart Device Relationships: The Four Agency Profiles
Abstract
1. Introduction
- User agency: the user's self-perceived ability to use a device.
- Device agency: the capacities the user attributes to the device.
1.1. The Role of Agency in Human–Technology Interactions: The Perspective of Affordances
1.2. A Further Look at Agency in the Literature of Information Systems (IS) and Human–Computer Interaction (HCI)
2. Materials and Methods
2.1. Toward Developing a Scale for Agency in Human and Smart Device Relationships
2.2. On Developing the Agency Survey
- Price utility: the perception of money well spent when purchasing the device [42].
2.3. Data Collection
3. Results
3.1. Exploratory Factorial Analysis
3.2. Tag Cloud Analysis
- Phase 1: the removal of punctuation and stop words;
- Phase 2: the removal of secondary stop words with the help of a domain expert.
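The two cleaning phases can be sketched in a few lines of Python; the stop-word lists below are hypothetical stand-ins, not the ones used in the study.

```python
import re
from collections import Counter

# Phase 1: remove punctuation and generic stop words.
STOP_WORDS = {"the", "a", "an", "and", "or", "is", "it", "to", "of", "i", "my", "me"}
# Phase 2: remove domain-specific secondary stop words chosen with a
# domain expert (illustrative examples only).
DOMAIN_STOP_WORDS = {"device", "smart"}

def tag_cloud_counts(responses):
    """Word frequencies after the two-phase cleaning, ready for a tag cloud."""
    counts = Counter()
    for text in responses:
        words = re.findall(r"[a-z']+", text.lower())              # drops punctuation
        words = [w for w in words if w not in STOP_WORDS]         # phase 1
        words = [w for w in words if w not in DOMAIN_STOP_WORDS]  # phase 2
        counts.update(words)
    return counts

counts = tag_cloud_counts([
    "My smart device helps me work.",
    "I need the device to work!",
])
```

The surviving frequencies (here, "work" appearing twice) are what a tag cloud renderer would scale word sizes by.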
- In the controller's box (upper left), we first see a plethora of action verbs (make, work, need, know, watch, and learn) and, second, nouns such as tool, interaction, goals, user, time, sense, and function. Negative words are strikingly absent. The words convey function orientation and activity, which fits the profile of the controller as someone who attributes high agency to themselves while keeping the device as just a tool with low independent agency.
- The collaborator's box (upper right) emphasizes communication- and interaction-oriented words, for example, need, feel, objects, human, situation, want, gathering, intrusive, missing, communication, helps, effects, disappointed, feedback, addict, and consent. This segment suggests a more personal relationship with the device and, notably, lacks the action- and tool-oriented impression of the controller segment.
- In the victim's segment (lower left), the words “Alexa” and “way” stand out first; further exploration, however, reveals many negatively loaded words: hate, error, try, stupid, avoid, yelling, and laziness. Such negative expressions are absent from the victim's polar opposite, the controller segment, and the action-oriented words that characterize the controller are likewise missing here.
- In the detached segment (lower right), we see a multitude of nouns referring to the actual devices people are talking about, such as vacuum, keyboard, phone, function, solution, robot, and camera. The most important keywords thus concern only the technology, not the user's actions and feelings in interaction with the device, unlike in the tag clouds of the other three interaction styles. In this segment, some verbs, such as “feel”, are negated in the original data (“do not feel”).
3.3. Sentiment Analysis
- Polarity refers to the strength of opinion that a person has in relation to the subject that is being described. It ranges between −1 (negative opinion) and +1 (positive opinion);
- Subjectivity refers to the degree to which a person is involved with the device and measures the degree to which the experience is described using facts versus personal stories. It ranges between 0 (factual conversation) and +1 (personal opinion).
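As a toy illustration of how these two scores behave, the sketch below averages per-word (polarity, subjectivity) pairs from a small hypothetical lexicon; the study used an off-the-shelf sentiment tool, so this only demonstrates the metric ranges, not the actual scoring method.

```python
# Hypothetical lexicon: word -> (polarity in [-1, 1], subjectivity in [0, 1]).
LEXICON = {
    "hate": (-0.8, 0.9),
    "stupid": (-0.7, 0.9),
    "helps": (0.5, 0.4),
    "great": (0.8, 0.75),
}

def sentiment(text):
    """Average (polarity, subjectivity) over the opinion words found in text."""
    scores = [LEXICON[w] for w in text.lower().split() if w in LEXICON]
    if not scores:
        return 0.0, 0.0  # no opinion words: treated as factual and neutral
    polarity = sum(p for p, _ in scores) / len(scores)
    subjectivity = sum(s for _, s in scores) / len(scores)
    return polarity, subjectivity

polarity, subjectivity = sentiment("i hate this stupid vacuum")
```

A complaint like "i hate this stupid vacuum" lands near the negative, highly subjective corner, while a sentence with no lexicon hits scores as factual (0, 0).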
3.4. Data Analysis: User and Device Agency and the Background Variables
3.5. Background Variables and Their Relationship with User Agency
3.6. Device Agency and the Background Variables
4. Discussion
4.1. User and Device Agency and Their Implications for Designing HCI Experiences
4.2. Limitations and Avenues for Future Research
5. Conclusions
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
References
- Baird, A.; Maruping, L.M. The Next Generation of Research on IS Use: A Theoretical Framework of Delegation to and from Agentic IS Artifacts. MIS Q. 2021, 45, 315. [Google Scholar] [CrossRef]
- Moore, J.W. What is the sense of agency and why does it matter? Front. Psychol. 2016, 7, 1272. [Google Scholar] [CrossRef] [PubMed]
- Limerick, H.; Coyle, D.; Moore, J.W. The Experience of agency in human-computer interactions: A review. Front. Hum. Neurosci. 2014, 8, 643. [Google Scholar] [CrossRef]
- Shneiderman, B. Direct manipulation for comprehensible, predictable and controllable user interfaces. In IUI '97: Proceedings of the 2nd International Conference on Intelligent User Interfaces, Orlando, FL, USA, 6–9 January 1997; Moore, J., Edmonds, E., Puerta, A., Eds.; Association for Computing Machinery: New York, NY, USA, 1997; pp. 33–39. [Google Scholar]
- Gibson, J. The Theory of Affordances. In Perceiving, Acting, and Knowing: Toward an Ecological Psychology; Shaw, R., Bransford, J., Eds.; Lawrence Erlbaum Associates: Hillsdale, NJ, USA, 1977; pp. 67–82. [Google Scholar]
- Norman, D. The Psychology of Everyday Things; Basic Books: New York, NY, USA, 1988. [Google Scholar]
- Hutchby, I. Technologies, Texts, and Affordances. Sociology 2001, 35, 441–456. [Google Scholar] [CrossRef]
- Stendal, K.; Thapa, D.; Lanamäki, A. Analyzing the concept of affordances in information systems. In Proceedings of the 49th Hawaii International Conference on System Sciences (HICSS), Koloa, HI, USA, 5–8 January 2016; pp. 5270–5277. [Google Scholar]
- Volkoff, O.; Strong, D.M. Affordance theory and how to use it in is research. In The Routledge Companion to Management Information Systems; Galliers, R.D., Stein, M.-K., Eds.; Taylor and Francis: Milton, UK, 2017; pp. 232–245. [Google Scholar] [CrossRef]
- Pozzi, G.; Pigni, F.; Vitari, C. Affordance Theory in the IS Discipline: A Review and Synthesis of the Literature. In Proceedings of the 20th American Conference on Information Systems AMCIS 2014 Proceedings, Savannah, GA, USA, 7 November 2014; Volume 13, pp. 1–12. [Google Scholar]
- Salo, M.; Pirkkalainen, H.; Chua, C.E.H.; Koskelainen, T. Formation and Mitigation of Technostress in the Personal Use of IT. MIS Q. 2022, 46, 1073–1108. [Google Scholar] [CrossRef]
- Kaptelinin, V.; Nardi, B. Affordances in HCI: Toward a mediated action perspective. In Proceedings of the Conference on Human Factors in Computing Systems, Austin, TX, USA, 5–10 May 2012. [Google Scholar] [CrossRef]
- Zammuto, R.F.; Griffith, T.L.; Majchrzak, A.; Dougherty, D.J.; Faraj, S. Information technology and the fabric of organization. Organ. Sci. 2007, 18, 749–762. [Google Scholar] [CrossRef]
- Bandura, A. Social Cognitive Theory: An Agentic Perspective. Annu. Rev. Psychol. 2001, 52, 21–41. [Google Scholar] [CrossRef] [PubMed]
- McCarthy, J.; Wright, P. Putting ‘felt-life’ at the centre of human–computer interaction (HCI). Cogn. Technol. Work 2005, 7, 262–271. [Google Scholar] [CrossRef]
- Rose, J.; Jones, M. The Double Dance of Agency: A Socio-Theoretic Account of How Machines and Humans Interact. Syst. Signs Actions Int. J. Commun. Inf. Technol. Work 2005, 1, 19–37. [Google Scholar]
- Boudreau, M.-C.; Robey, D. Enacting Integrated Information Technology: A Human Agency Perspective. Organ. Sci. 2005, 16, 3–18. [Google Scholar] [CrossRef]
- Engen, V.; Pickering, J.; Walland, P. Machine Agency in Human-Machine Networks; Impacts and Trust Implications. In Proceedings of the 18th International Conference on Human-Computer Interaction International, Toronto, ON, Canada, 17–22 July 2016; Volume 9733, pp. 96–106. [Google Scholar] [CrossRef]
- Johnson, D.; Verdicchio, M. AI, agency and responsibility: The VW fraud case and beyond. AI Soc. 2019, 34, 639–647. [Google Scholar] [CrossRef]
- Novak, T.; Hoffman, D. Relationship journeys in the internet of things: A new framework for understanding interactions between consumers and smart objects. J. Acad. Mark. Sci. 2019, 47, 216–237. [Google Scholar] [CrossRef]
- Rammert, W. Distributed Agency and Advanced Technology Or: How to Analyse Constellations of Collective Inter-Agency. In Agency without Actors? New Approaches to Collective Action; Passoth, J., Peuker, B., Schill, M., Eds.; Routledge: London, UK, 2014; pp. 89–112. [Google Scholar]
- Serrano, C.; Karahanna, E. The compensatory interaction between user capabilities and technology capabilities in influencing task performance. MIS Q. 2016, 40, 597–622. [Google Scholar] [CrossRef]
- Nevo, S.; Nevo, D.; Pinsonneault, A. A temporally situated self-agency theory of information technology reinvention. MIS Q. 2016, 40, 157–186. [Google Scholar] [CrossRef]
- Carter, M.; Grover, V. Me, My Self, and I(T): Conceptualizing Information Technology Identity and its Implications. MIS Q. 2015, 39, 931–957. [Google Scholar]
- Berberian, B. Man-Machine teaming: A problem of agency. IFAC-PapersOnLine 2019, 51, 118–123. [Google Scholar] [CrossRef]
- Berberian, B.; Sarrazin, J.C.; Le Blaye, P.; Haggard, P. Automation technology and sense of control: A window on human agency. PLoS ONE 2012, 7, e34075. [Google Scholar] [CrossRef] [PubMed]
- McEneaney, J. Agency Attribution in Human-Computer Interaction. In Engineering Psychology and Cognitive Ergonomics EPCE 2009, Lecture Notes in Computer Science; Harris, D., Ed.; Springer: Berlin/Heidelberg, Germany, 2009; Volume 5639, pp. 81–90. [Google Scholar] [CrossRef]
- McEneaney, J. Agency Effects in Human–Computer Interaction. Int. J. Hum.-Comput. Interact. 2013, 29, 798–813. [Google Scholar] [CrossRef]
- Sato, A.; Yasuda, A. Illusion of sense of self-agency: Discrepancy between the predicted and actual sensory consequences of actions modulates the sense of self-agency, but not the sense of self-ownership. Cognition 2005, 94, 241–255. [Google Scholar] [CrossRef]
- Mick, D.G.; Fournier, S. Paradoxes of Technology: Consumer Cognizance, Emotions, and Coping Strategies. J. Consum. Res. 1998, 25, 123–143. [Google Scholar] [CrossRef]
- Schweitzer, F.; Belk, R.; Jordan, W.; Ortner, M. Servant, friend or master? The relationships users build with voice-controlled smart devices. J. Mark. Manag. 2019, 35, 693–715. [Google Scholar] [CrossRef]
- Shneiderman, B.; Plaisant, C.; Cohen, M.; Jacobs, S.; Elmqvist, N. Designing the User Interface: Strategies for Effective Human-Computer Interaction, 6th ed.; Pearson: Boston, MA, USA, 2016. [Google Scholar]
- Scott, J.E.; Walczak, S. Cognitive engagement with a multimedia ERP training tool: Assessing computer self-efficacy and technology acceptance. Inf. Manag. 2009, 46, 221–232. [Google Scholar] [CrossRef]
- Kang, H.; Kim, K.J. Feeling connected to smart objects? A moderated mediation model of locus of agency, anthropomorphism, and sense of connectedness. Int. J. Hum.-Comput. Stud. 2020, 133, 45–55. [Google Scholar] [CrossRef]
- Jia, H.; Wu, M.; Jung, E.; Shapiro, A.; Sundar, S.S. Balancing human agency and object agency: An end-user interview study of the internet of things. In Proceedings of the 2012 ACM Conference on Ubiquitous Computing (UbiComp '12), Association for Computing Machinery, New York, NY, USA, 5 September 2012; pp. 1185–1188. [Google Scholar] [CrossRef]
- Gunkel, D.J. Communication and Artificial Intelligence: Opportunities and Challenges for the 21st Century. Communication 2012, 1. [Google Scholar] [CrossRef]
- Hassenzahl, M. The Interplay of Beauty, Goodness, and Usability in Interactive Products. Hum.-Comput. Interact. 2004, 19, 319–349. [Google Scholar] [CrossRef]
- Alter, S. Making Sense of Smartness in the Context of Smart Devices and Smart Systems. Inf. Syst. Front. 2020, 22, 381–393. [Google Scholar] [CrossRef]
- Waytz, A.; Heafner, J.; Epley, N. The mind in the machine: Anthropomorphism increases trust in an autonomous vehicle. J. Exp. Soc. Psychol. 2014, 52, 113–117. [Google Scholar] [CrossRef]
- Dillon, A. Beyond usability: Process, outcome and affect in human-computer interactions. Can. J. Libr. Inf. Sci. 2002, 26, 57–69. [Google Scholar]
- Shin, D.-H. Cross-analysis of usability and aesthetic in smart devices: What influences users’ preferences. Cross Cult. Manag. 2012, 19, 563–587. [Google Scholar] [CrossRef]
- Kim, H.-W.; Gupta, S.; Koh, J. Investigating the intention to purchase digital items in social networking communities: A customer value perspective. Inf. Manag. 2011, 48, 228–234. [Google Scholar] [CrossRef]
- Belk, R. Possessions and the Extended Self. J. Consum. Res. 1988, 15, 139–168. [Google Scholar]
- Costello, A.B.; Osborne, J. Best practices in exploratory factor analysis: Four recommendations for getting the most from your analysis. Pract. Assess. Res. Eval. 2005, 10, 7. [Google Scholar] [CrossRef]
- Taber, K.S. The Use of Cronbach’s Alpha When Developing and Reporting Research Instruments in Science Education. Res. Sci. Educ. 2018, 48, 1273–1296. [Google Scholar] [CrossRef]
- Cortina, J.M. What is coefficient alpha? An examination of theory and applications. J. Appl. Psychol. 1993, 78, 98. [Google Scholar]
- Kline, P. An Easy Guide to Factor Analysis; Routledge: London, UK, 2002. [Google Scholar]
- Janssens, W.; Wijnen, K.; De Pelsmacker, P.; Van Kenhove, P. Marketing Research with SPSS; Prentice Hall, Pearson Education: Upper Saddle River, NJ, USA, 2008. [Google Scholar]
- Field, A. Discovering Statistics Using SPSS, 2nd ed.; Sage Publications: London, UK, 2005. [Google Scholar]
- Schaumburg, H. Computers as Tools or as Social Actors?—The Users’ perspective on Anthropomorphic Agents. Int. J. Coop. Inf. Syst. 2001, 10, 217–234. [Google Scholar]
- Lelli, F.; Toivonen, H. A Survey for Investigating Human and Smart Devices Relationships. 2021. [Google Scholar] [CrossRef]
Emphasis of Agency | | | | | |
---|---|---|---|---|---|
Humans | Intentional binding [3,27,28] | The felt-life perspective aligning agency with a sense that my actions are my own [15] | Self-efficacy [14,33] | Users feeling like masters of their devices interact more with them [31] | Only humans can make intentional actions; technology is just a mediating artifact or a tool [12,16,18] |
Technology | Users can be servants of their devices [31] | Humans can feel inept and stupid with technology; technology can facilitate disorder, inefficiency, enslavement, and chaos [30] | The principle of general symmetry: machines, too, can be agents in their own right; cf. Latour's Actor-Network Theory. | | |
Interaction | Triadic agency includes the causal agency of artifacts and the intentional agency of humans [19] | Both humans and smart objects exercise agency, but of different kinds; their relationship is built upon a sense of connectedness [34] | Both humans and technology have agency, and object agency can be internalized as a part of the individual's own power [35] | Actions emerge out of complicated constellations of distributed agency [21] | |
High User Agency | Low User Agency | High Device Agency | Low Device Agency |
---|---|---|---|
Q_1 I’m positive about having this smart device as a part of my daily life. Construct: Feelings and enjoyment, positive affect Source: [17,27] | Q_2 I often feel victimized by my smart device, like it does things to me rather than the other way around. Construct: Feeling controlled by and incapable in relation to technology Source: [31,35] | Q_3 The smart device runs its actions independently of me/does what it does on its own. Construct: The device’s general capacity to act independently Source: [20,37] | Q_4 I feel that my smart device cannot initiate actions on its own. Construct: The device’s ability to initiate actions independently (reverse verbalization) Source: [31,34] |
Q_5 It is me, the user, who is in control of stopping tasks/processes with my smart device. Construct: Control, the feeling that one can control one's actions to intentionally produce effects in the external world; the user's sense of being able to choose and change the way the device does things. Source: [3,15,16,26,29,34] | Q_6 I am not able to change the way I do things if my smart device is involved; it forces a certain process on me. Construct: Acting freely with the device when reaching one's goals Sources: [3,18] | Q_7 This smart device cannot independently change the course in how a task is completed. Construct: Human control, the sense that one coordinates the functions of the object and chooses and changes how technology works, understood as expressing the low agency of the device. Source: [34] | Q_8 My smart device has intelligence and understanding of its own, independent of me. Construct: The device having its own “consciousness” and “free will”; the device's capacity to, e.g., deviate from expectations and adjust its behavior (internal regulation). Sources: [21,34,37] |
Q_9 I understand how this smart device works. Construct: Mastering by learning the operations Sources: [30] | Q_10 I have often had a less than ideal way of using my smart device in the past. Construct: Self-efficacy understood as one’s belief that one uses the device in the way it is supposed to be used (reverse verbalization) Sources: [33] | Q_11 My smart device is a true active participant in the interaction I have with it. Construct: Equal and interdependent interaction with a device seen as an active agent with human-like mental capacities Sources: [20,21,30,38] | Q_12 My smart device is fundamentally dependent on me, the human user. Construct: The independence of the device from the user (reverse verbalization) Source: [20] |
Q_13 I am using this smart device to execute task(s) that I have freely chosen. Construct: Acting in a free and independent manner in relation to technology when pursuing one’s goals Sources: [3,18] | Q_14 I often think that I cannot really use my smart device to achieve the things I want. Construct: Control over one’s goals when interacting with technology Sources: [3,18] | Q_15 Certain things are performed better by my smart device than by me. Construct: The human uses the device’s resources by having it act on his behalf; technological agency concerns replacing/taking over of a part of human power Sources: [14,16,18,35] | Q_16 My smart device is not really helpful in getting things done. Construct: Technology facilitates human efficiency and capacities (reverse verbalization) Sources: [16,30] |
Sex | Age | Education | Professional Status | Professional Knowledge | Professional Experience |
---|---|---|---|---|---|
Male 68.9% | 20 or younger 0.2% | Informatics, computer science, computer engineering 57.8% | High school or below 1.5% | Yes 82.8% | Yes 39.3% |
Female 29.9% | 21–30 years 11.2% | Humanistic, social sciences, psychology, education 12.2% | Bachelor's 5.6% | No 17.0% | No 60.5% |
Other/prefer not to say 1.0% | 31–40 years 29.8% | Business, finance, management 8.8% | Master's 23.8% | | |
| 41–50 years 27.4% | | Ph.D. 30.1% | | |
| 51–60 years 20.3% | | Professors 38.3% | | |
| 61 or older 11.1% | | | | |
Factor | Loading |
---|---|
User Agency | |
Item: Q_14 | 0.737 |
Item: Q_10 | 0.532 |
Item: Q_16 | 0.488 |
Item: Q_6 | 0.420 |
Eigenvalues | 1.823 |
% of Variance | 22.79 |
Composite Reliability | 0.66 |
Device Agency | |
Item: Q_5 | 0.672 |
Item: Q_4 | 0.571 |
Item: Q_12 | 0.562 |
Item: Q_7 | 0.482 |
Eigenvalues | 2.103 |
% of Variance | 26.28 |
Composite Reliability | 0.62 |
Interconstruct Correlation | −0.135 |
Cumulative Variance Explained by both factors | 49.07% |
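The composite reliability figures in the table can be reproduced approximately (the exact estimator is not specified) from the standardized loadings with the common formula CR = (Σλ)² / ((Σλ)² + Σ(1 − λ²)):

```python
# Composite reliability from standardized factor loadings.
def composite_reliability(loadings):
    squared_sum = sum(loadings) ** 2           # (sum of loadings)^2
    error = sum(1 - l ** 2 for l in loadings)  # summed error variances
    return squared_sum / (squared_sum + error)

# User agency loadings as reported in the table above.
user_agency_loadings = [0.737, 0.532, 0.488, 0.420]
cr = composite_reliability(user_agency_loadings)
```

With these loadings the formula yields roughly 0.63, in the same range as the reported 0.66; the small gap likely reflects a different reliability estimator.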
Q1 | Q2 | Q3 | Q4 | Q5 | Q6 | Q7 | Q8 | Q9 | Q10 | Q11 | Q12 | Q13 | Q14 | Q15 | Q16 | Device Agency | User Agency | |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
Q1 | 1 | |||||||||||||||||
Q2 | −0.320 ** | 1 | ||||||||||||||||
Q3 | −0.008 | 0.320 ** | 1 | |||||||||||||||
Q4 | 0.100 * | −0.152 ** | −0.259 ** | 1 | ||||||||||||||
Q5 | 0.197 ** | −0.299 ** | −0.270 ** | 0.351 ** | 1 | |||||||||||||
Q6 | −0.048 | 0.216 ** | 0.167 ** | 0.066 | −0.187 ** | 1 | ||||||||||||
Q7 | 0.087 * | −0.142 ** | −0.126 ** | 0.296 ** | 0.305 ** | 0.024 | 1 | |||||||||||
Q8 | 0.041 | 0.175 ** | 0.278 ** | −0.215 ** | −0.143 ** | 0.136 ** | −0.251 ** | 1 | ||||||||||
Q9 | 0.252 ** | −0.201 ** | −0.132 ** | 0.122 ** | 0.326 ** | −0.097 * | 0.148 ** | −0.103 * | 1 | |||||||||
Q10 | −0.218 ** | 0.299 ** | 0.068 | 0.006 | −0.153 ** | 0.217 ** | −0.051 | 0.076 | −0.162 ** | 1 | ||||||||
Q11 | 0.064 | 0.207 ** | 0.199 ** | −0.185 ** | −0.128 ** | 0.038 | −0.165 ** | 0.345 ** | 0.024 | 0.062 | 1 | |||||||
Q12 | 0.114 ** | −0.202 ** | −0.260 ** | 0.296 ** | 0.431 ** | −0.081 * | 0.267 ** | −0.247 ** | 0.185 ** | −0.130 ** | −0.182 ** | 1 | ||||||
Q13 | 0.200 ** | −0.272 ** | −0.239 ** | 0.141 ** | 0.328 ** | −0.189 ** | 0.172 ** | −0.195 ** | 0.245 ** | −0.149 ** | −0.038 | 0.311 ** | 1 | |||||
Q14 | −0.196 ** | 0.270 ** | 0.109 ** | 0.060 | −0.123 ** | 0.311 ** | 0.021 | 0.005 | −0.151 ** | 0.410 ** | 0.028 | −0.111 ** | −0.170 ** | 1 | ||||
Q15 | 0.182 ** | −0.047 | 0.038 | −0.034 | 0.023 | −0.012 | 0.033 | 0.134 ** | 0.008 | 0.009 | 0.169 ** | 0.109 ** | 0.072 | −0.085 * | 1 | |||
Q16 | −0.219 ** | 0.169 ** | 0.023 | 0.105 * | −0.024 | 0.172 ** | 0.060 | −0.027 | −0.122 ** | 0.253 ** | −0.034 | −0.036 | −0.121 ** | 0.355 ** | −0.202 ** | 1 | ||
Device Agency | 0.174 ** | −0.277 ** | −0.325 ** | 0.720 ** | 0.721 ** | −0.055 | 0.665 ** | −0.308 ** | 0.271 ** | −0.112 ** | −0.237 ** | 0.700 ** | 0.332 ** | −0.047 | 0.044 | 0.044 | 1 | |
User Agency | −0.244 ** | 0.350 ** | 0.141 ** | 0.086 * | −0.184 ** | 0.648 ** | 0.016 | 0.070 | −0.195 ** | 0.685 ** | 0.034 | −0.131 ** | −0.232 ** | 0.765 ** | −0.103 * | 0.629 ** | −0.066 | 1 |
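Each cell of the matrix above is a Pearson correlation between two item-response vectors. A minimal stdlib sketch, with made-up Likert data rather than the study's, looks like this:

```python
from math import sqrt

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length vectors."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    var_x = sum((a - mx) ** 2 for a in x)
    var_y = sum((b - my) ** 2 for b in y)
    return cov / sqrt(var_x * var_y)

# Two hypothetical respondents-by-item columns on a 1-5 Likert scale.
r = pearson([1, 2, 3, 4, 5], [2, 1, 4, 3, 5])  # 0.8: strongly co-varying items
```

Values near ±1 indicate items that move together (or in opposition) across respondents, which is what the factor analysis above exploits.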
Type of Device Chosen | n | Mean | SD | Mean Difference vs. 1 (Personal Assistant) | Mean Difference vs. 2 (Household Cleaning) | Confidence Interval (95%) |
---|---|---|---|---|---|---|
1. Household cleaning devices | 53 | 2.83 | 0.83 | | | |
2. Other household devices | 41 | 2.73 | 0.74 | 0.734 ** | | 1.1911–0.2776 |
3. Personal hygiene devices | 15 | 2.72 | 0.72 | 0.725 * | | 1.3586–−0.0921 |
4. Personal health devices | 23 | 2.59 | 0.81 | 0.596 * | | 1.1430–0.0484 |
5. Smart watches/bracelets | 175 | 2.58 | 0.79 | 0.585 ** | | 0.9022–0.2676 |
6. Computers and smartphones | 76 | 2.34 | 0.61 | | −0.495 * | −0.0589–−0.9311 |
7. Smart TVs and video-streaming devices | 112 | 2.30 | 0.76 | | −0.531 ** | −0.1302–0.9324 |
8. Personal assistant devices | 86 | 1.99 | 0.81 | | 0.844 ** | 1.2678–0.4194 |
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Toivonen, H.; Lelli, F. The Varieties of Agency in Human–Smart Device Relationships: The Four Agency Profiles. Future Internet 2024, 16, 90. https://doi.org/10.3390/fi16030090