Informatics
  • Article
  • Open Access

26 August 2023

Theoretical Models for Acceptance of Human Implantable Technologies: A Narrative Review

1 School of Computing and Informatics, University of Louisiana at Lafayette, Lafayette, LA 70504, USA
2 Department of Business Analytics and Technology Management, Towson University, Towson, MD 21252, USA
* Author to whom correspondence should be addressed.
This article belongs to the Section Human-Computer Interaction

Abstract

Theoretical models play a vital role in understanding the barriers to and facilitators of the acceptance or rejection of emerging technologies. We conducted a narrative review of theoretical models predicting the acceptance and adoption of human enhancement embeddable technologies to assess how well those models have studied the unique attributes and qualities of embeddables and to identify gaps in the literature. Our broad search across multiple databases and Google Scholar identified 16 relevant articles published since 2016. We found that three main theoretical models, namely the technology acceptance model (TAM), the unified theory of acceptance and use of technology (UTAUT), and the cognitive–affective–normative (CAN) model, have been consistently used and refined to explain the acceptance of human enhancement embeddable technology. Psychological constructs such as self-efficacy, motivation, and self-determination, as well as demographic factors, were also explored as mediating and moderating variables. Based on our analysis, we collated the verified determinants into a comprehensive model, modifying the CAN model. We also identified gaps in the literature and recommended further exploration of design elements and psychological constructs. Additionally, we suggest investigating other models such as the matching person and technology model (MPTM), the hedonic-motivation system adoption model (HMSAM), and the value-based adoption model (VAM) to provide a more nuanced understanding of the adoption of embeddable technologies. Our study not only synthesizes the current state of research but also provides a robust framework for future investigations. By offering insights into the complex interplay of factors influencing the adoption of embeddable technologies, we contribute to the development of more effective strategies for design, implementation, and acceptance, thereby paving the way for the successful integration of these technologies into everyday life.

1. Introduction

Human implantable technologies or embeddables represent a burgeoning field of innovation, characterized by computing devices that are surgically inserted or implanted within the human body. These devices serve multifaceted purposes, ranging from tracking health metrics to augmenting human capabilities and facilitating connections to digital technologies. Notable examples of embeddables gaining societal traction include biostamps, microchips, brain implants, password pills, and magnetic implants []. The conceptual foundation of human–technology coupling can be traced to the seminal work of American psychologist Joseph C. R. Licklider, who, in his 1960 publication Man-Computer Symbiosis [], envisioned a symbiotic relationship between humans and computers as a forthcoming stage in human evolution. Licklider’s vision encompassed a future where computer intelligence would surpass and operate independently of human control [].
In the contemporary ubiquitous computing era, where computational devices permeate every facet of daily life, embeddables are emerging as a sophisticated means to streamline communication between users and computing systems. They hold the potential to anticipate and fulfill users’ needs autonomously, possibly leading to a reduction or even elimination of on-screen interactions, thereby realizing the vision of calm computing []. The implications of embeddables extend far beyond mere functionality; they have the potential to profoundly reshape societal structures, influence human psychology, and propel intellectual advancement. Andy Goodman insightfully observes, “Embeddables are not just going to be a revolution in functionality, but will dramatically alter how people fit into society, affect human psychology, and even propel us toward intellectual transcendence” []. While these advancements promise myriad benefits and bring us closer to Licklider’s vision of human–computer symbiosis, the success of this integration is not solely dependent on the technological advancements themselves. Rather, it hinges on understanding the complex interplay between human psychology, societal norms, and technological innovation.
The attitudes, apprehensions, and expectations of individuals toward emerging technologies constitute pivotal elements that may either promote or hinder their broad adoption. Consequently, an in-depth examination of the underlying determinants that govern users’ acceptance or rejection of such technologies is an imperative scholarly pursuit. This comprehension transcends mere theoretical interest; it is vital for the actualization of the vision of human–computer symbiosis and yields additional advantages. A nuanced understanding of the reasons behind the acceptance or rejection of technologies equips researchers with the insights necessary to devise innovative methodologies for the evaluation and prediction of user attitudes toward novel technological paradigms. Furthermore, the cognizance of the multifaceted factors that influence users’ decisions to engage with specific systems can guide the conceptualization and development of future technologies []. Considering that the interaction between humans and technology is shaped by an intricate interplay of contextual and behavioral variables, the application of psychological models and theories often becomes indispensable to elucidate this complex human behavior [].
Embeddable technologies, characterized by their unique attributes such as miniaturization, invasiveness, connectivity, and potential for automation, present a distinct challenge in the field of technology acceptance. These attributes necessitate a nuanced understanding that goes beyond traditional models, capturing the multifaceted nature of human interaction with these technologies. Over the past two decades, various models and theories have been proposed to predict the acceptance of embeddables for diverse purposes, ranging from health monitoring to cognitive enhancement [,,]. However, a comprehensive and unified understanding remains elusive. Existing models often provide fragmented insights, tailored to specific contexts [,], and may lack the breadth and depth to fully encompass the complexities of embeddable technology acceptance. While insightful, they may not entirely encapsulate the unique characteristics, ethical considerations, societal implications, and psychological complexities associated with embeddable technologies. The diversity of applications, ranging from medical interventions to lifestyle enhancements, further complicates the task of understanding and predicting user acceptance. The inconsistencies and gaps in current theoretical frameworks highlight a critical need for a comprehensive overview of the topic. The rapid evolution of embeddable technologies [,], coupled with their potential societal and ethical implications, underscores the urgency of this endeavor.
It is within this complex and dynamic landscape that the present study is situated. The primary objectives are to critically assess how well current technology acceptance models have addressed the unique qualities and attributes of embeddables, and to pinpoint areas that remain unexplored or underexplored. By meticulously synthesizing the existing literature, identifying key themes and patterns, and recognizing gaps, this review not only aims to propose extensions to existing theories but also to highlight relevant models and theories that may be instrumental in addressing the identified gaps. The goal transcends a mere theoretical exploration; it seeks to align technological innovation with human needs, values, and aspirations, thereby laying a foundation for future research, development, and practical implementation of embeddable technologies. This scholarly endeavor begins with an in-depth examination of the unique attributes of embeddables, unraveling their multifaceted nature. It then provides a comprehensive overview of prevailing theoretical models, weaving together insights from various disciplines and perspectives. This foundational understanding sets the stage for a critical and reflective discussion of the findings. The discussion delves into implications of the findings, explores potential avenues for future research, and offers a road map to bridge the gaps and build upon the existing knowledge base. By fostering a nuanced understanding of embeddable technology acceptance, the study aspires to contribute to a more human-centered and ethically grounded approach to technological innovation.

3. Study

The primary objective of this study is to identify trends in the development of technology acceptance models for human implantable technologies. We aim to critically assess how well current models have addressed the unique qualities and attributes of embeddables and to pinpoint gaps in theoretical development.

3.1. Materials and Methods

We conducted a comprehensive search across four primary databases: PubMed, Web of Science, IEEE Xplore, and Science Direct, in addition to Google Scholar. The keywords employed in the search included “embeddables”, “technology acceptance models”, “theoretical models”, “technology acceptance”, “insideables”, “implantables”, “insertables”, “human implantable technologies”, “biohacking”, “cyborgs”, and others related to embeddables and technology acceptance. We restricted our search to articles that were published post-2016, which led to the identification of 184 relevant records. After eliminating duplicate articles, two independent researchers applied the inclusion and exclusion criteria while screening the titles and abstracts of the remaining articles. This resulted in 105 articles. The researchers then conducted a full-text review of these articles and finally selected 16 articles that met all the stipulated criteria (Table 2).
Table 2. Models and theories used to understand acceptance of embeddables.
The criteria for selection were as follows: articles that (a) explored end-user perspectives for predicting acceptance or adoption, (b) leveraged a theory or a model to understand the acceptance, rejection, or prediction of the utilization of an embeddable technology, and (c) studied insideables and/or implantables (capacity-enhancing technologies). Exclusion criteria encompassed articles that (a) focused on embeddables in nonhuman subjects (e.g., animals), (b) lacked a model, theory, or theoretical framework to investigate the acceptance of embeddables, (c) were concerned with corrective human implants, such as implantable medical devices, and (d) studied interacting with cyborgs as opposed to being cyborgs.
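As a bookkeeping aid, the screening flow described above can also be recorded as a simple tally. The short Python sketch below mirrors the counts reported in this section; the stage labels are our own shorthand and purely illustrative.

```python
# Illustrative tally of the screening flow reported in Section 3.1.
# The counts come from the text; the stage labels are our own shorthand.
screening_flow = [
    ("records identified (four databases + Google Scholar)", 184),
    ("articles retained after de-duplication and title/abstract screening", 105),
    ("articles retained after full-text review against all criteria", 16),
]

for stage, count in screening_flow:
    print(f"{count:>4}  {stage}")
```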

3.2. Analysis

The analysis began with categorizing papers according to the type of models used to study embeddable technology acceptance. The authors then assessed the specific attributes of embeddables that were studied by each model and identified the determinants that were posited to influence acceptance. Building on this, the authors mapped the attributes to the determinants, creating categories within each model. This process allowed for an understanding of how different models conceptualize the relationship between the unique qualities of embeddables and the factors that drive acceptance.
Following this categorization and mapping, the authors critically assessed the opportunities and limitations of each study, evaluating the strengths, weaknesses, and contextual considerations of each model. Finally, the insights were synthesized to identify gaps in the existing literature. By drawing together the findings from diverse studies, the authors pinpointed areas where further research was needed and opportunities for refining existing theoretical frameworks.
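To illustrate the mapping step described above, the sketch below encodes a model-to-attribute-to-determinant structure in plain Python. The labels are examples drawn from the studies discussed in Section 4, not the authors' actual coding scheme.

```python
# Illustrative sketch of the model -> attribute -> determinant mapping used in the analysis.
# Attribute and determinant labels are examples from the reviewed studies, not a full coding scheme.
model_attribute_map = {
    "TAM": {
        "functional benefit": ["perceived usefulness (PU)"],
        "invasiveness/health": ["health concerns (HC)", "perceived risk (PR)"],
        "data exchange": ["perceived trust (PT)", "privacy concerns (PrC)"],
    },
    "CAN": {
        "bodily integration": ["positive emotions", "negative emotions", "anxiety"],
        "social controversy": ["subjective norm"],
        "ethical stakes": ["ethical judgment"],
    },
    "UTAUT/UTAUT2": {
        "functional benefit": ["performance expectancy (PE)"],
        "everyday use": ["habit (H)", "hedonic motivation (HM)"],
        "social controversy": ["social influence (SI)"],
    },
}

def determinants_for(model: str) -> list[str]:
    """Flatten all determinants recorded for a given model."""
    return sorted({d for ds in model_attribute_map.get(model, {}).values() for d in ds})

if __name__ == "__main__":
    for model in model_attribute_map:
        print(model, "->", determinants_for(model))
```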

4. Results

A summary of all the studies has been provided in Table 2. Three main types of implantable technologies have been studied and three technology acceptance models (Table 3) have been investigated with new determinants. In addition, several studies simply focused on investigating the influence of specific determinants on the intention of use. Below, we describe our findings along with a brief overview of the technology models that have been identified.
Table 3. Comparison of the models reported in the studies.

4.1. Types of Embeddables

The following embeddable technologies have been mentioned in the literature:
  • Subcutaneous microchips (SM)—tiny integrated circuits that are about the size of a rice grain, usually encased inside transponders and placed underneath the skin. Five articles specifically explored the acceptance of subcutaneous microchip implants.
  • Capacity-enhancing nanoimplants—nanotechnology-based implants that can be integrated into the human body to augment or enhance certain abilities or functions. These implants are typically designed at the nanoscale, allowing for precise manipulation of, and interaction with, biological systems.
  • Neural implants—technological devices that are implanted inside the brain to improve the memory performance of an individual. Two studies investigated people’s acceptance of neural implants for memory and performance enhancement purposes.
  • Cyborg technologies—“cyborg” is a portmanteau of “cybernetic organism” used to describe people enhanced with both organic and digital (implantable or insideable) body parts. Cyborg technologies refer to any type of embeddable technology that is used by a healthy individual to enhance innate human capabilities.
The remaining articles explored people’s attitudes towards implantable technologies overall, without any reference to a specific type of implantable technology.

4.2. Technology Acceptance Model

To date, TAM has been empirically established as a robust model to explain factors influencing the adoption of almost every technological innovation []. Our review shows that embeddable technology is no exception to this rule. Nine out of fourteen selected publications used TAM or some version of TAM to predict behavioral intention to use embeddables.
In its most basic form, the TAM posits that individuals rely on two cognitive processes to form biases and make decisions about whether to use a technology, even before initiating any effort towards its use. The first process is ’Perceived Usefulness’ (PU), where individuals are inclined to use a technology if they perceive it to be beneficial for achieving specific goals, such as enhancing job performance. Davis [] defines PU as the degree of this belief in an individual. If the technology is deemed useful, it leads to a positive attitude; conversely, a negative perception results in a negative attitude. The second process is ’Perceived Ease of Use’ (PEU), which concerns the individual’s understanding of the effort required to use the technology. Davis describes PEU as a measure of a person’s belief regarding the amount of effort needed to use the technology []. A belief that the technology is easy to use fosters positive attitudes towards it and vice versa.
The TAM posits that when a technology is perceived to be useful and easy to use (Figure 1), people have a positive attitude toward it, which influences their intention to use it (BIU). However, depending on an individual’s age and gender, these perceptions can vary. These factors can be expected to play a significant role even in the case of embeddable human enhancement technologies. It is conceivable that an individual is going to accept this technology, only if he or she (i) perceives it to be useful or providing some advantage (PU), and (ii) perceives it to be easy to use (PEU).
Figure 1. Technology acceptance model (basic TAM).
The TAM is a powerful framework that can be easily extended and modified to include additional determinants of behavioral intention to use a technology, with TAM 2 [] and TAM 3 [] being two major upgrades to the TAM that can be further extended and adapted for various technologies. TAM 2 focuses on fleshing out the external variables that influence PU, whereas TAM 3 focuses on defining external influences on the PEU []. Besides TAM extensions, there has also been a trend to combine the TAM with other theoretical models.
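As a hedged illustration of how the basic TAM paths can be estimated from survey data, the sketch below fits three ordinary least squares regressions (PU regressed on PEU, attitude on PU and PEU, and intention on attitude). The data file, column names, and use of OLS on composite scores are assumptions for illustration, not the procedure of any reviewed study, which typically relies on structural equation modeling of latent constructs.

```python
# Minimal sketch: estimating the basic TAM paths with OLS on composite scores.
# The CSV file and column names (PEU, PU, ATT, BIU) are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("embeddables_survey.csv")  # hypothetical survey data

pu_model = smf.ols("PU ~ PEU", data=df).fit()         # PEU is posited to feed into PU
att_model = smf.ols("ATT ~ PU + PEU", data=df).fit()  # PU and PEU shape attitude
biu_model = smf.ols("BIU ~ ATT", data=df).fit()       # attitude predicts intention to use

for name, m in [("PU", pu_model), ("ATT", att_model), ("BIU", biu_model)]:
    print(name, m.params.round(3).to_dict(), "R^2 =", round(m.rsquared, 3))
```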
The studies utilizing the TAM to understand the intention to use embeddables have made significant strides in addressing the unique qualities and attributes of embeddable technology. Our review shows that both TAM extensions and TAM combinations have been used to predict and explain embeddable adoption. Below, we discuss how TAM has been able to treat various qualities of embeddable technologies as well as future opportunities.

4.2.1. Emphasis on Perceived Usefulness and Ease of Use

Even though the original TAM studied behavioral intention to use technology via attitudes towards that technology, most studies that use the TAM to understand the intention to use embeddables did not include attitude in their refined models. Instead, they directly studied the impact of various determinants on the behavioral intention to use. However, the remaining two core determinants, i.e., PU and PEU, have been central to many studies, which we discuss below.
Embedded technologies have uniquely useful functions, such as enhancing people’s innate capabilities and giving them computer intelligence and sensory capabilities by connecting them to the digital world. Reinares-Lara et al. [] demonstrated that the PU of embeddables has a significant impact on people’s attitude towards them. This aligns with the functional aspects of embeddables, such as their utility in enhancing human capabilities. Interestingly, Reinares-Lara et al. [] could not establish the impact of the PEU on attitudes towards brain implants, suggesting that the unique attributes of embeddables may require a nuanced understanding of the ease of use. However, Gangadharbatla [] was able to demonstrate that the PEU had an impact on attitudes towards embeddables in general but not on the willingness to use them, clearly showing that other attributes of embeddables are more important when studying the reasons behind their adoption.
Werber et al. [], true to the original TAM model, were able to demonstrate that the PEU had a direct impact on PU. This can be explained by the minute sizes of these devices and their ability to be autonomous, which can diminish concerns around their ease of use. Indeed, Cicevic et al. [], based on their quantitative descriptive results, concluded that their research subjects had positive attitudes toward the PU and PEU of microchips.

4.2.2. Incorporation of Perceived Trust and Health Concerns

Recognizing the intimate nature of embeddables and the potential health implications, several studies have extended TAM to include perceived trust (PT) and health concerns (HC). These extensions reflect a deeper understanding of the unique attributes of embeddables, such as the need for public trust in the safety of the technology and concerns about potential health risks.
Studies such as that of Werber et al. [] have shown that PT is a significant predictor of the intention to use, while HC act as a negative predictor of PU []. Gangadharbatla’s study [] included perceived risk (PR), adding depth to the understanding of how perceived health and safety risks impact acceptance. They were able to demonstrate that PR, including health risks, can influence attitudes and the willingness to adopt embeddables. Cicevic et al. [] found that participants in their study had a very low PT of microchips, while HC were neutral. However, no one was willing to use microchips for everyday activities at home. Qualitative studies such as that of Shafeie et al. [] provide nuanced insights into how health and safety considerations shape the willingness to adopt microchip implants. The authors concluded that even though the public finds diverse applications of microchips useful, scientific research has not advanced enough to garner public trust in the safety of embeddable technology [,].
Overall, this theme provides a cohesive understanding of how health and safety considerations influence the acceptance of embeddable technologies. By recognizing and exploring this theme, researchers have acknowledged the unique potential risks associated with the integration of embeddables within the human body.

4.2.3. Consideration of Privacy and Security

Trust in the safety and privacy of embeddable technologies is crucial, given their intimate connection with the user. The close integration of embeddables with the human body and their ability to exchange data with external devices introduce potential risks related to data privacy and security. Their ability to exchange data with other entities also makes them susceptible to misinformation, conspiracy theories, and fake news.
Gangadharbatla’s study [] established the role of perceived risk (PR) and privacy concerns (PrC) in influencing attitudes and the willingness to adopt embeddables. Studies by Werber et al. [] and Cicevic et al. [] explored PT as a determinant of acceptance, reflecting the need for public trust in the technology’s ability to safeguard personal information. Žnidaršič et al. [] investigated how fake news and misinformation impact PT, adding complexity to the understanding of privacy concerns.
These studies have recognized the multifaceted nature of privacy concerns, ranging from data security to public trust and misinformation. This consolidated understanding underscores the importance of addressing privacy considerations in the design, implementation, and promotion of embeddable technologies. It also emphasizes the need for transparent communication, robust security measures, and public education to foster trust and mitigate privacy concerns among potential users. The exploration of privacy concerns in these studies contributes valuable insights to the ongoing discourse on the responsible development and acceptance of embeddable technologies.

4.2.4. Exploration of Demographics and Individual Factors

Some studies have also explored the influence of individual factors such as age, gender, race, and socioeconomic status. This aligns with the recognition that embeddables may have different implications for different individuals, requiring personalized considerations. Gangadharbatla’s [] findings indicated that factors such as age, gender, and self-efficacy in embedded technology also influence the adoption of embedded technology, whereas Werber et al. [] reported that age does not moderate acceptance. Such conflicting findings challenge previous beliefs and underscore the need for a nuanced understanding and investigation of the influence of demographic factors and embeddable technology type on acceptance.

4.2.5. Limitations and Opportunities

The TAM presents unique opportunities for studying the adoption of embeddable technologies, but there are areas where further exploration is needed. Many existing studies within this category do not distinguish between different types of insideables, nor do they adequately address the varying levels of ethical and other dilemmas associated with them. This presents an opportunity for future research to delve into specific uses of implantable technologies and to investigate how attitudes towards their use are influenced by factors such as privacy concerns, ethical judgments, and the degree of invasiveness of the implant. Future research could also explore the PT and HC theme, delving deeper into the specific concerns and how they interact with other factors to shape acceptance.
One promising avenue for exploration is the analysis of the influence of ethical judgment, depending on the degree of invasiveness of the implant and the extent of human transformation they entail from an evolutionary perspective. This could lead to a more nuanced understanding of the ethical considerations that underpin people’s acceptance or rejection of these technologies.
Moreover, the majority of existing studies have focused on young higher-education students, limiting the generalizability of findings. Future research should strive to include different segments of the population and various characteristics for a broader societal understanding. This could involve targeting older adults, particularly those with limited computer access, and considering factors such as education, socioeconomic status, health, cultural background, profession, religion, and other relevant variables.
By expanding the scope of the TAM to encompass these areas, researchers can develop a more comprehensive and contextually relevant model for understanding the adoption of embeddable technologies. This could lead to more effective strategies for promoting responsible use and addressing the complex ethical, social, and psychological issues that these technologies raise. In summary, while the TAM has already contributed valuable insights into the adoption of embeddable technologies, there is significant potential for further refinement and application in this rapidly evolving field.

4.3. Cognitive–Affective–Normative Model

The cognitive–affective–normative (CAN) model (Figure 2) has been a significant approach in understanding the acceptance of embeddable technology. Developed by Pelegrin-Borondo et al. [], the model stresses the importance of looking at both cognitive and affective elements [], or cognitive, affective, and normative elements []. It attempts to account for both technology and human abilities by combining cognitive (PU and PEU) and normative (subjective norm) determinants of behavioral intention from TAM 1 and TAM 2 with three affective variables, i.e., positive emotions, negative emotions, and anxiety. CAN has been repeatedly extended and used to study the acceptance of various types of capacity-enhancing embeddable technologies. For example, it has been used to assess behavioral intention toward being a cyborg [], interacting with a cyborg [], accepting a brain implant for one’s children [], etc.
Figure 2. Cognitive–affective–normative (CAN) model.
The CAN model was tested via a self-administered online survey of 600 individuals from an unselected population []. Structural equation modeling demonstrated that the model could account for 73.92% of the variance in responses concerning the intention to use embeddables among early adopters, that is, individuals who use an innovation before others. In the following, we demonstrate how well the studies utilizing the CAN model have addressed the issue of embeddable technology acceptance based on their unique qualities and attributes.
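As an illustration of how a CAN-style specification can be written for estimation, the sketch below uses lavaan-like syntax with the Python semopy package. The indicator names, data file, and measurement structure are our own assumptions for illustration, not the instrument used by Pelegrín-Borondo et al.

```python
# Hedged sketch: specifying a CAN-style structural model in lavaan-like syntax with semopy.
# Indicator names (pu1, pos1, ...) and the data file are hypothetical.
import pandas as pd
from semopy import Model

# Latent variables: PU and PEU (cognitive), POS/NEG emotions and ANX (affective),
# SN = subjective norm (normative), BIU = behavioral intention to use.
CAN_SPEC = """
PU  =~ pu1 + pu2 + pu3
PEU =~ peu1 + peu2 + peu3
POS =~ pos1 + pos2
NEG =~ neg1 + neg2
ANX =~ anx1 + anx2
SN  =~ sn1 + sn2
BIU =~ biu1 + biu2
BIU ~ PU + PEU + POS + NEG + ANX + SN
"""

data = pd.read_csv("can_survey.csv")  # hypothetical survey data
model = Model(CAN_SPEC)
model.fit(data)
print(model.inspect())  # loadings and structural path estimates
```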

4.3.1. Embracing Emotional Responses

The CAN model’s affective determinant recognizes that embeddable technologies can elicit mixed emotional responses in users. This acknowledgment of both positive and negative emotions reflects an understanding of the complex human reactions to the idea of integrating technology within the body. By considering emotions, the CAN model goes beyond traditional models such as TAM, capturing a more nuanced view of human interaction with embeddables. With the help of the CAN model [], it has been shown that the affective determinant is the strongest antecedent for early adopters, with positive emotions being the most influential of all the emotions.

4.3.2. Incorporating Social Influence

The normative dimension of the CAN model emphasizes the influence of societal views on individual attitudes towards embeddables. Given the novelty and potential controversy surrounding implantable devices, this aspect of the model is particularly relevant. It acknowledges that acceptance is not solely a matter of personal preference but is shaped by broader social norms and expectations. Pelegrín-Borondo et al. [] showed that the normative determinant was the strongest antecedent for early adopters.

4.3.3. Ethical Considerations

While the CAN model itself does not inherently capture ethical aspects, some studies have integrated ethical constructs within the model to investigate concerns related to personal identity, security, and privacy. This integration reflects an awareness of the unique ethical dilemmas posed by embeddables, such as the potential impact on human psychology and the moral considerations surrounding the use of such technologies. Ethical dilemmas are complex because they exist in a multitude of domains, such as social, economic, environmental, educational, moral, etc. Regardless of the domain of interest, individuals use a subjective process known as ethical judgment, which considers moral equity, relativism, egoism, utilitarianism, and contractualism, to decide which actions have moral groundings [].
The studies conducted by Pelegrín-Borondo et al. [], Olarte-Pascual et al. [], and Reinares-Lara et al. [] explored the ethical dimensions of embeddable technology acceptance. They found that ethical judgment, including principles such as moral equity, relativism, egoism, and utilitarianism, has strong positive relationships with the intention to use insideables, while contractualism has a weak, nonsignificant relationship. Reinares-Lara et al. [] investigated ethical concerns related to brain implants, such as personal identity, security, and privacy, and categorized participants based on their ethical stances. Although they did not find a moderating effect of the ethical construct on acceptance, the ethical assessment did explain variations in behavioral intention.
The CAN model offers a comprehensive framework that aligns well with the unique qualities and attributes of embeddable technologies. By considering cognitive, affective, normative, and ethical dimensions, it provides a multifaceted view of acceptance. The studies utilizing the CAN model have contributed valuable insights into the acceptance of embeddable technologies, reflecting an understanding of their unique characteristics and the complex human responses they elicit.

4.3.4. Limitations and Opportunities

Existing research reflects general beliefs about implantable technologies, as participants were not given additional information that could influence perceptions and emotions. The studies relied on self-assessed emotions, which could have affected the quality of the data that depended on emotional responses. The existing research also indicates that the moderating effect of culture on the ethical construct needs further investigation. The investigation of cultural differences has begun to receive attention, but regions such as Oceania, Africa, and the Middle East remain unexplored and would benefit from further examination. Moreover, while the published work has examined the connections between cognitive, affective, and normative factors in evaluating the acceptance of implantable technologies, the underlying reasons for the relationships between these variables could be further explored.

4.4. Unified Theory of Acceptance and Use of Technology

Venkatesh et al. [] developed the UTAUT to assess a user’s intention to use a technology. The model unifies eight prominent theories: the theory of reasoned action (TRA), the theory of planned behavior (TPB), the model of PC utilization (MPCU), the motivational model (MM), the TAM, social cognitive theory (SCT), innovation diffusion theory (IDT), and the combined TAM-TPB. The UTAUT contains four core determinants to explain and predict technology adoption: performance expectancy (PE) (a combination of PU, extrinsic motivation, relative advantage, job-fit, and outcome expectations), effort expectancy (EE) (equivalent to the PEU), social influence (SI) (equivalent to social norms), and facilitating conditions (FC) (referring to an individual’s belief that they can find organizational and technical support to use the system). The individual factors of gender, age, experience (previous technology experience), and voluntariness (willingness) of use moderate the strength of the primary relationships in the model (Figure 3). While the original eight models and their extensions explain between 30 and 60% of the variation in behavioral intention to accept a technology, the UTAUT explains an improved 70% of the variation.
Figure 3. Unified theory of acceptance and use of technology (UTAUT).
An extension of the UTAUT called UTAUT 2 [] provides three additional determinants to explain behavioral intention to use a system: hedonic motivation (HM), price value (PV), and habit (H). Additionally, the UTAUT 2 model dropped voluntariness of use from the set of moderating variables. The multitude of determinants and moderating variables improves the predictive and explanatory power of the UTAUT, but at the expense of increasing the complexity of the model application. Therefore, the moderating variables are often ignored to simplify the model application process. It should also be noted that UTAUT 2 is more suitable for assessing the acceptance of technologies within the consumer market, whereas the UTAUT focuses on the use of technologies within an organizational setting. Therefore, it is not surprising that UTAUT 2 is used to explain the adoption of embeddables by consumers.
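To make the role of the UTAUT moderators concrete, the sketch below shows one common way of probing them: interaction terms in a regression on behavioral intention. The variable names and data file are hypothetical, and published UTAUT work typically uses SEM or PLS rather than plain OLS.

```python
# Hedged sketch: probing UTAUT(2)-style moderation with interaction terms in OLS.
# Column names (PE, EE, SI, FC, HM, PV, H, age, experience, gender) and the file are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("utaut_survey.csv")  # hypothetical survey data

formula = (
    "BIU ~ PE + EE + SI + FC + HM + PV + H"   # UTAUT 2 determinants
    " + PE:age + EE:age + EE:experience"       # example moderation terms
    " + SI:gender"
)
fit = smf.ols(formula, data=df).fit()
print(fit.summary().tables[1])  # coefficients for main effects and interactions
```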

4.4.1. UTAUT Modification for Implantable Technology

The studies that utilize the UTAUT to address the issue of embeddable technology acceptance have provided valuable insights into acceptance issues related to the unique qualities and attributes of this technology.
Boella, Girju, and Gurviciute [] developed an extension of UTAUT 2 to explain the acceptance of microchip implants, focusing on university students and experts. They identified five new determinants of microchip adoption: functionality, health, invasiveness, privacy, and safety. These determinants reflect the views of young consumers in Sweden and highlight the importance of PE in the willingness to adopt microchips. However, the study’s limitation lies in its lack of consideration for personal factors such as gender, age, and cultural contexts, which can provide a more comprehensive understanding of microchip adoption.
Sabogal-Alfaro et al. [] applied UTAUT 2 to investigate the intention to use nonmedical insertable devices in Colombia and Chile. Their findings revealed that four factors (HM, H, PE, and SI) positively influenced the intention to use embeddables in both countries, with habit mediating the relationship between SI and intention to use. EE was found to be significant only in Chile. This study emphasizes the importance of self-efficacy factors such as habits and HM in emerging technologies, indicating that they may be more influential than traditional predictors.
Arias-Oliva et al. [] employed the UTAUT 2 model to examine the intentions of 1563 higher education students from seven countries to use wearables and insideables. The study concluded that the model’s explanatory power is highly valuable for both researchers and decision-makers. Specifically, they found that PE positively influences the intention to use both wearables and insideables, while EE does not have a significant impact on either. SI was found to significantly and positively affect the intention to use insideables, but not wearables. HM had a significant positive influence on both, whereas FC did not influence either. The authors concluded that usefulness drives the adoption of accepted technologies, whereas societal approval is key for emerging, disruptive technologies.
Overall, the application of the UTAUT in these studies has helped in understanding the multifaceted nature of embeddable technology acceptance. It has shed light on the specific factors that influence acceptance, such as functionality, health considerations, privacy concerns, and the role of habits. However, there is room for further exploration, especially in considering personal and cultural factors, to provide a more nuanced understanding of how different populations perceive and accept embeddable technologies.

4.4.2. Limitations and Opportunities

Overall, the UTAUT has not been extensively applied in the context of embeddable technology. This presents both a limitation and an opportunity for future research. Similar to the challenges encountered with other models, studies employing the UTAUT in this domain may face issues such as a lack of generalizability due to focusing on specific populations or particular types of embeddable technologies. However, the UTAUT’s comprehensive framework, which considers factors such as PE, EE, SI, and FC, could provide a more nuanced understanding of the adoption of embeddable technologies. By exploring the influence of moderating factors such as age, culture, experience, and voluntariness of use, UTAUT could uncover insights that are missed by other models. This could lead to a more tailored approach for promoting or regulating embeddable technologies, taking into account the diverse needs and values of different user groups. Thus, while the application of the UTAUT to embeddable technology is still in its infancy, it holds a significant potential for advancing our understanding of this complex and rapidly evolving field.

4.5. Psychological Constructs

The studies that utilize psychological constructs to address the issue of embeddable technology acceptance have taken diverse approaches to explore the unique qualities and attributes of this technology. The following psychological issues and their relationship to embeddable technology acceptance have been studied.

4.5.1. Technology Anxiety and Privacy Concerns

Pramatari and Theotokis [] have shown that individual traits such as technology anxiety and information privacy concerns negatively affect attitudes towards RFID-enabled services. This highlights the importance of addressing psychological barriers in technology adoption. Gangadharbatla [] established through a survey of 1063 individuals from an unselected population that self-efficacy in using embeddable technologies influences people’s decision to adopt them.

4.5.2. Personality Dimensions

Earlier studies had hypothesized links between personality dimensions such as extraversion, sensing, thinking, and judging and the inclination toward embedding RFID chips []. In a recent study with 111 undergraduate students, Chebolu [] found no significant relationship between the five personality traits and intended use. Other individual factors such as education, religion, biological sex, and race or ethnicity also had no significant relationship with intended use.

4.5.3. Ethical Awareness and Cultural Considerations

Murata et al. [] explored how ethical awareness, perceptions of innovativeness, and perceived risk affected the decision to become a cyborg, specifically comparing the distinct cultures of Japan and Spain. Surprisingly, ethics emerged as the most influential factor, and no statistically significant cultural differences were found between the two countries. This study emphasized the universal role of ethics but may have overlooked nuanced cultural views on body modification due to its focus on a specific population.

4.5.4. Motivation and Trust

Chebolu [] incorporated psychological constructs of motivation and trust to explore user attitudes towards implantable technology. They found positive correlations between technology competence, relatedness, autonomy, and intended use, and confirmed trust as a mediating factor. However, no significant relationships were found with personality traits or other individual factors, suggesting that motivation and trust may be more central to acceptance.

4.5.5. Perfectionism and Locus of Control

Ahadzadeh et al. [] conducted a survey with 647 Spanish university students between the ages of 18 and 30 to explore the impact of psychological factors on the intention to use memory implants for enhancement purposes. Their findings revealed that both neurotic and normal perfectionism have a positive effect on the inclination to use memory implants. Additionally, an internal locus of control was positively linked to the behavioral intention to utilize these implants for enhancement. The relationship between normal perfectionism and the intention to use memory implants was found to be moderated by locus of control, while the connection between neurotic perfectionism and the intention to use memory implants was not influenced by locus of control. These findings reveal complex relationships between these factors, providing insights into the psychological nuances of technology acceptance.

4.5.6. Implicit Psychosocial Drivers

According to Freudian theory, drivers can be understood as unconscious needs that are at the heart of human motivation. Giger and Gaspar [] have identified several implicit psychosocial drivers or motivators of the intention to practice body hacking. These include existential, identity, ideological, cognitive, affective, utilitarian, social affiliation, and epistemic drivers. However, the lack of empirical testing of their model leaves unanswered questions about the impact of these drivers.
In essence, studies have explored aspects such as ethics, motivation, trust, perfectionism, and cultural perceptions, making significant strides in understanding the multifaceted psychological factors influencing embeddable technology acceptance.

4.5.7. Limitations and Opportunities

There are only a few studies that consider the impact of psychological factors on the adoption of implantable technologies. The complexity of human psychology and the novelty of embeddable technologies make this a rich area for continued research, and a more comprehensive understanding of these factors could greatly inform the design and marketing of embeddable technologies [,]. An opportunity for future research lies in the integration of more nuanced psychological theories with technology acceptance models, such as those concerned with psychological readiness and individual values. For example, psychological readiness could explore an individual’s mental preparedness to accept embeddables, their adaptability, and their openness to change. This could be particularly relevant in the context of technologies that might alter human capabilities or the human body itself. Individual values, including ethical or moral beliefs, could also profoundly affect decisions about technology adoption. For instance, some individuals might reject memory implants on principle, regardless of their perceived usefulness or ease of use, due to deeply held beliefs about human nature, dignity, or the sanctity of the mind. Overall, as new types of embeddables continue to change, investigating personal variables that affect the adoption process will remain important and may benefit from additional psychological models and theories.

5. Discussion

We presented a narrative review of studies exploring behavioral intention to accept or adopt embeddable technology using theoretical or model-based approaches. We classified the existing research into four major categories, corresponding to three main theoretical models for technology acceptance (TAM, CAN, and UTAUT) and a fourth category exploring the moderating role of psychological constructs. Below, we discuss the implications of our review and findings.

5.1. Principal Findings

Wolbring et al. [] conducted a literature review in 2013 to investigate technology acceptance models for cognitive/neuro enhancers, social robots, and brain–computer interfaces. They found that these models were not being used to understand the acceptance or rejection of such technologies. Nearly a decade later, our review shows progress in developing technology acceptance models for embeddable technologies. The field has gravitated towards extending and modifying the TAM to explain the acceptance of embeddable technology. Despite this progress, much work remains to investigate the acceptance and consequences of implantable human enhancement technology.
Based on our review, it is clear that the TAM has provided insights into how users perceive the functional benefits of embeddable technologies. However, the unique attributes of embeddables, such as their intimate integration with human physiology and potential ethical dilemmas, may not be fully captured by the TAM. The model’s emphasis on usability might overlook the complex interplay of psychological, social, and ethical factors that influence the acceptance of embeddables. While the TAM can provide a foundational understanding, its application to embeddables may require extensions or modifications to encompass the multifaceted nature of these technologies.
The CAN model introduces cognitive, affective, and normative factors into the understanding of technology acceptance. This model aligns more closely with the qualities and attributes of embeddables, considering not only cognitive perceptions but also emotional responses and social norms. The CAN model can capture the emotional impact of embeddables, such as how they might enhance or disrupt self-identity, and the societal perceptions that may influence acceptance or rejection. However, the CAN model might still fall short in addressing the full spectrum of ethical considerations and individual customization needs that are inherent to embeddables. A further refinement and integration of ethical frameworks might be necessary to fully align the CAN model with the complexities of embeddable technology acceptance.
The UTAUT integrates multiple constructs to provide a more comprehensive understanding of technology acceptance. In the context of embeddables, the UTAUT’s consideration of factors such as social influence, facilitating conditions, and individual differences aligns well with the diverse attributes of these technologies. The UTAUT can capture the influence of societal norms, individual preferences, and the supporting environment on the acceptance of embeddables. However, like the other models, the UTAUT may still need further adaptation to fully address the unique ethical considerations, design complexities, and potential human–machine hybridity associated with embeddables.
While the TAM, CAN, and UTAUT have contributed valuable insights into the acceptance of embeddable technologies, they may not fully encompass the unique qualities and attributes of these innovations. The intimate integration with human bodies, potential enhancement capabilities, ethical dilemmas, and individual customization needs present challenges that may require a further refinement and integration of these models. Future research could focus on extending these models or developing new frameworks that more accurately reflect the multifaceted nature of embeddable technology acceptance, considering both technological capabilities and the broader human, societal, and ethical context.
We have collated the verified determinants into a single model as a modification of the CAN model, which explains up to 77% of the variance in embeddable technology acceptance (Table 4). These determinants are classified into six broad categories: cognitive, affective, normative, behavioral, ethical, and technical. Since embeddable technology is still emerging, myriad determinants and moderating variables may still need exploration. Ultimately, a technology acceptance model that thoughtfully considers the design aspects of embedded technology is likely to be more effective and reflective of real-world user behavior and preferences [].
Table 4. Determinants of the behavioral intention to use embedded technology. *** Not yet investigated for their impact.
Furthermore, it is imperative to delve into a range of communication and social psychology theories to uncover the moderators and mediators that influence the intention to use embeddable technologies. Our review highlights a noticeable gap in the literature, particularly in the exploration of the relationship between psychological dispositional constructs and the intention to embrace embeddable technology. This deficiency calls for a comprehensive investigation into various factors that may play a role in this context. These factors encompass not only individual characteristics such as age, gender, education, health conditions, experience, occupation, and religion but also broader cultural contexts that shape attitudes and behaviors. Understanding these multifaceted influences can provide a more nuanced and complete picture of how and why people may choose to adopt or reject embeddable technologies, thereby informing more effective strategies for design, implementation, and acceptance [].
Some researchers criticize technology acceptance models for ignoring technology design as a possible determinant, which may have an impact on user acceptance [,]. Particularly, many believe that technology acceptance models tend to focus on presenting factors that are only appropriate for measuring social acceptance of a technological product while it is still being developed. Since people tend to appropriate technologies according to their contexts, technology acceptance models employing a simplistic understanding of acceptance can create many problems. We can expect people to invent many new uses of embeddable technology once they are readily available in the market. Therefore, it is recommended that researchers complement this line of research with qualitative methods, to uncover information on the designs wanted and needed by the target audience and to understand what can make this technology more desirable. In essence, a technology acceptance model that is sensitive to design issues of embedded technology may end up being more efficacious [].
Based on the above discussion, we have added a new determinant, i.e., technology design, for further exploration. We define design as the physical characteristics of a technology, including, but not limited to: (a) how data (input and output) exchange takes place between the user and the technology, (b) the aesthetics and physical appearance of the technology, and (c) the level of user control over the technology’s functioning and appearance. We recommend that researchers investigate additional design elements that may impact a user’s decision to accept or reject embeddable enhancement technology.
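To make the proposed technology design determinant more tangible, the sketch below enumerates a hypothetical item pool for its three components. These statements are illustrative only and have not been validated as a measurement scale.

```python
# Hypothetical item pool for the proposed "technology design" determinant.
# These statements are illustrative only; they are not a validated instrument.
from dataclasses import dataclass, field

@dataclass
class DesignConstruct:
    data_exchange: list[str] = field(default_factory=lambda: [
        "I understand how the implant sends data to and receives data from other devices.",
        "The ways I can provide input to the implant suit my daily routine.",
    ])
    aesthetics: list[str] = field(default_factory=lambda: [
        "The implant would be unnoticeable to other people.",
        "The physical appearance of the implant is acceptable to me.",
    ])
    user_control: list[str] = field(default_factory=lambda: [
        "I can switch the implant's functions on and off when I want to.",
        "I can adjust what the implant records and shares.",
    ])

    def all_items(self) -> list[str]:
        """Return the pooled items across the three design components."""
        return self.data_exchange + self.aesthetics + self.user_control

if __name__ == "__main__":
    for item in DesignConstruct().all_items():
        print("-", item)
```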

5.2. Research Implications

The field of human–computer interaction (HCI) [] emphasizes the importance of incorporating human needs and emotions in technology design, advocating for a user-centered approach. While traditional models have provided valuable insights into technology adoption, there is an opportunity to expand this research by exploring other models that offer unique perspectives on user interaction and experience. Three such models, the matching person and technology model (MPTM) [], the hedonic-motivation system adoption model (HMSAM) [], and the value-based adoption model (VAM) [], could provide distinct lenses through which to study the adoption of implantable technologies.
The MPTM posits that the effective adoption of technology depends on the compatibility between the user and the technology. In the context of implantable technologies, this means the device must align with the user’s requirements, preferences, and abilities. For example, an active young person might prefer an implantable fitness device, while an elderly person with limited mobility might benefit from a pacemaker with remote monitoring. The MPTM emphasizes individual user differences, advocating for technology adaptation to suit specific needs. Since the existing research still lacks an exploration of specific embeddable technologies and what different demographics might expect from them, this model can provide useful insights into the design and development of future embeddable technologies.
Contrastingly, the HMSAM focuses on both functional benefits and emotional experiences in technology adoption. It suggests that emotional experiences, influenced by aesthetics, enjoyment, and social influence, drive adoption. In the context of implantable technologies, this means the embeddable technology must fulfill its function and provide a pleasant emotional experience. For example, a cochlear implant might restore hearing and enhance social interactions, improving overall quality of life. The existing research currently lacks an exploration of this emotional aspect of embeddable technology; therefore, HMSAM could be an appropriate model to enhance our understanding of embeddable technologies.
Both MPTM and HMSAM provide valuable insights into implantable technology adoption. The MPTM emphasizes user differences and the need for adaptive technology, while the HMSAM emphasizes emotional experience. By considering both models, researchers can gain a comprehensive understanding of adoption factors and guide the development of well-suited, functional, and emotionally satisfying devices.
The VAM adds another layer to this understanding by focusing on individual values, beliefs, and ethics. Unlike traditional models, the VAM recognizes that an individual’s unique values guide their decisions, such as weighing enhanced capabilities against privacy concerns or ethical beliefs about human enhancement. In the context of embeddable technologies, these values shape perceptions and attitudes, influencing adoption. For example, a high value on privacy might lead to negative attitudes towards adoption. The VAM also considers social and cultural factors, recognizing that societal norms shape values. Some versions of the VAM integrate perceived risk and trust as influencing factors, recognizing that these can significantly impact the perceived value of a technology. Overall, the VAM takes a more holistic approach by considering both extrinsic and intrinsic values. While other models might focus on functional aspects such as efficiency and effectiveness (extrinsic values), the VAM also considers intrinsic values such as enjoyment and personal satisfaction.
In summary, the MPTM, HMSAM, and VAM offer nuanced approaches to understanding the complex interplay of psychology, values, and social influences in the adoption of embeddable technologies. The MPTM emphasizes compatibility and individual differences, the HMSAM focuses on emotional experience, and the VAM explores individual values and ethics. Together, these models provide a comprehensive framework for studying the adoption of embeddable technologies, with practical implications for design, marketing, policy, and ethics. By integrating these models, researchers and practitioners can design technology that resonates with users’ needs, preferences, and values, creating technology more likely to be successfully adopted and utilized.
Embeddables are in an evolutionary stage, and it is clear that we still know very little about the acceptance of this technology in humans. Until embeddables become mainstream, they will be seen as a futuristic technology that can enhance humans in ways that can only be imagined at this point. Therefore, any theoretical model developed to explain the acceptance and adoption of embeddables is incomplete. Bagozzi, Davis, and Warshaw state: “Because new technologies such as personal computers are complex and an element of uncertainty exists in the minds of decision makers with respect to the successful adoption of them, people form attitudes and intentions toward trying to learn to use the new technology prior to initiating efforts directed at using. Attitudes towards usage and intentions to use may be ill-formed or lacking in conviction or else may occur only after preliminary strivings to learn to use the technology evolve” []. Therefore, even if a model is able to capture all relevant determinants, how people use the product may not be directly related to the attitudes and intentions resulting from these determinants.

6. Limitations

Since this was a narrative review, we conducted a rather broad literature search that included a range of embeddable technologies. It is possible that several related studies were not included. Moreover, we only considered manuscripts that tested their proposed models with potential users. A few manuscripts proposed new models but did not test them with potential users, or tested them with decision makers rather than potential users; therefore, they were not included in this review.

7. Conclusions

The rapid advancement of embeddable technologies has opened new horizons for human enhancement, convenience, and medical applications. However, the adoption of these technologies is a complex phenomenon, influenced by a myriad of factors ranging from individual needs and emotions to societal norms and ethical considerations. This research paper sought to explore this multifaceted landscape, shedding light on the theoretical models and practical implications that govern the acceptance and rejection of embeddable technologies.
Our comprehensive review of the existing literature revealed a strong tendency to extend and modify traditional models, such as the TAM, to explain the acceptance of embeddable technology. We identified gaps in the current understanding, particularly in the areas of technology design, individual psychology, and the nuanced interplay of values and ethics.
We have gathered and organized the verified determinants into a single model and advocate testing new determinants, namely technology design, hedonism, and price value. In addition, we suggest exploring the MPTM, the HMSAM, and the VAM to obtain a more nuanced perspective on technology adoption. These models emphasize compatibility, emotional experience, and individual values, respectively, offering a richer framework for understanding user behavior. Moreover, our research highlights the need for further exploration and integration of qualitative methods, psychological theories, and social psychology constructs. Such an approach can uncover deeper insights into user needs and desires, leading to more effective technology acceptance models.
In conclusion, the adoption of embeddable technologies is a complex and evolving phenomenon, requiring a multifaceted approach that considers not only functionality but also human psychology, emotions, values, and ethics. Our research contributes to this understanding by offering a comprehensive review and proposing new avenues for exploration. As embeddable technologies continue to advance and become more mainstream, the insights gained from this research will be instrumental in guiding their successful integration into our lives, ensuring that they are not only technologically innovative but also human-centered, ethical, and socially responsible.

Author Contributions

Conceptualization, B.M.C. and M.M.; methodology, B.M.C. and M.M.; formal analysis, S.S. and B.M.C.; writing—original draft preparation, S.S. and B.M.C.; writing—review and editing, S.S., B.M.C. and M.M.; supervision, B.M.C. All authors have read and agreed to the submitted version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors’ employers were not involved in the conduct or reporting of this study. The authors and their employers have no conflicts of interest to declare.

References

  1. Gangadharbatla, H. Biohacking: An exploratory study to understand the factors influencing the adoption of embedded technologies within the human body. Heliyon 2020, 6, e03931. [Google Scholar] [CrossRef] [PubMed]
  2. Licklider, J.C. Man-computer symbiosis. IRE Trans. Hum. Factors Electron. 1960, HFE-1, 4–11. [Google Scholar] [CrossRef]
  3. Bardini, T. Bootstrapping: Douglas Engelbart, Coevolution, and the Origins of Personal Computing; Stanford University Press: Redwood City, CA, USA, 2000. [Google Scholar]
  4. Weiser, M. The computer for the 21st century. ACM Sigmobile Mob. Comput. Commun. Rev. 1999, 3, 3–11. [Google Scholar] [CrossRef]
  5. Goodman, A. Embeddables: The Next Evolution of Wearable Tech. In Designing for Emerging Technologies: UX for Genomics, Robotics, and the Internet of Things; Follett, J., Ed.; O’Reilly Media, Inc.: Sebastopol, CA, USA, 2014; Chapter 8; pp. 205–224. [Google Scholar]
  6. Alomary, A.; Woollard, J. How is technology accepted by users? A review of technology acceptance models and theories. In Proceedings of the 5th International Conference on 4E, London, UK, 8–12 June 2015. [Google Scholar]
  7. Taherdoost, H. A review of technology acceptance and adoption models and theories. Procedia Manuf. 2018, 22, 960–967. [Google Scholar] [CrossRef]
  8. Werber, B.; Baggia, A.; Žnidaršič, A. Factors affecting the intentions to use RFID subcutaneous microchip implants for healthcare purposes. Organizacija 2018, 51, 121–133. [Google Scholar] [CrossRef]
  9. Mohamed, M.A. Modeling of Subcutaneous Implantable Microchip Intention of Use. In Proceedings of the International Conference on Intelligent Human Systems Integration, Modena, Italy, 19–21 February 2020; pp. 842–847. [Google Scholar]
  10. Shafeie, S.; Chaudhry, B.M.; Mohamed, M. Modeling Subcutaneous Microchip Implant Acceptance in the General Population: A Cross-Sectional Survey about Concerns and Expectations. Informatics 2022, 9, 24. [Google Scholar] [CrossRef]
  11. Olarte-Pascual, C.; Pelegrín-Borondo, J.; Reinares-Lara, E.; Arias-Oliva, M. From wearable to insideable: Is ethical judgment key to the acceptance of human capacity-enhancing intelligent technologies? Comput. Hum. Behav. 2021, 114, 106559. [Google Scholar] [CrossRef]
  12. Žnidaršič, A.; Werber, B.; Baggia, A.; Vovk, M.; Bevanda, V.; Zakonnik, L. The Intention to Use Microchip Implants Model Extensions after the Pandemics. In Proceedings of the 16th International Symposium on Operational Research in Slovenia, Bled, Slovenia, 22–24 September 2021; pp. 247–252. [Google Scholar]
  13. Sparks, H. Pentagon Develops Implant that could Help Detect COVID under Your Skin; New York Post: New York, NY, USA, 2022. [Google Scholar]
  14. Hart, R. Elon Musk’s Neuralink Wants to Put Chips in Our Brains—How It Works and Who Else Is Doing It; Forbes: Jersey City, NJ, USA, 2023. [Google Scholar]
  15. Warwick, K. Implants; Springer: Berlin/Heidelberg, Germany, 2014. [Google Scholar]
  16. Murata, K.; Adams, A.A.; Fukuta, Y.; Orito, Y.; Arias-Oliva, M.; Pelegrín-Borondo, J. From a Science Fiction to the Reality: Cyborg Ethics in Japan. Orbit J. 2017, 1, 1–15. [Google Scholar] [CrossRef]
  17. Heersmink, R. The Philosophy of Human-Technology Relations. Philos. Technol. 2018, 31, 305–319. [Google Scholar]
  18. Hansson, S.O. Implantable Computers: The Next Step in Computer Evolution? Ethics Inf. Technol. 2005, 7, 115–126. [Google Scholar]
  19. Gray, C.H. Cyborg Citizen: Politics in the Posthuman Age; Routledge: London, UK, 2001. [Google Scholar]
  20. Grunwald, A. Nano- and Information Technology: Ethical Aspects. Int. J. Technol. Assess. Health Care 2004, 20, 15–23. [Google Scholar]
  21. Foster, K.R.; Jaeger, J. Ethical Implications of Implantable Radiofrequency Identification (RFID) Tags in Humans. Am. J. Bioeth. 2005, 5, 6–7. [Google Scholar] [CrossRef] [PubMed]
  22. Thomsen, N. Technology Acceptance for Hearing Aids: An Analysis of Adoption and Innovation; Aalborg University: Copenhagen, Denmark, 2021. [Google Scholar]
  23. Pommer, B.; Zechner, W.; Watzak, G.; Ulm, C.; Watzek, G.; Tepper, G. Progress and trends in patients’ mindset on dental implants. II: Implant acceptance, patient-perceived costs and patient satisfaction. Clin. Oral Implant. Res. 2011, 22, 106–112. [Google Scholar] [CrossRef] [PubMed]
  24. Venkatesh, V.; Thong, J.Y.; Xu, X. Unified Theory of Acceptance and Use of Technology: A Synthesis and the Road Ahead. J. Assoc. Inf. Syst. 2016, 17, 328–376. [Google Scholar] [CrossRef]
  25. Ibáñez-Sánchez, S.; Orus, C.; Flavian, C. Augmented reality filters on social media. Analyzing the drivers of playability based on uses and gratifications theory. Psychol. Mark. 2022, 39, 559–578. [Google Scholar] [CrossRef]
  26. Falgoust, G.; Winterlind, E.; Moon, P.; Parker, A.; Zinzow, H.; Madathil, K.C. Applying the uses and gratifications theory to identify motivational factors behind young adult’s participation in viral social media challenges on TikTok. Hum. Factors Healthc. 2022, 2, 100014. [Google Scholar] [CrossRef]
  27. Ajzen, I.; Kruglanski, A.W. Reasoned action in the service of goal pursuit. Psychol. Rev. 2019, 126, 774. [Google Scholar] [CrossRef]
  28. Go, H.; Kang, M.; Suh, S.C. Machine learning of robots in tourism and hospitality: Interactive technology acceptance model (iTAM)—Cutting edge. Tour. Rev. 2020, 75, 625–636. [Google Scholar] [CrossRef]
  29. Alfadda, H.A.; Mahdi, H.S. Measuring students’ use of zoom application in language course based on the technology acceptance model (TAM). J. Psycholinguist. Res. 2021, 50, 883–900. [Google Scholar] [CrossRef]
  30. Zhou, J.; Fan, T. Understanding the factors influencing patient E-health literacy in online health communities (OHCs): A social cognitive theory perspective. Int. J. Environ. Res. Public Health 2019, 16, 2455. [Google Scholar] [CrossRef]
  31. Chen, C.C.; Tu, H.Y. The effect of digital game-based learning on learning motivation and performance under social cognitive theory and entrepreneurial thinking. Front. Psychol. 2021, 12, 750711. [Google Scholar] [CrossRef]
  32. Pousada García, T.; Garabal-Barbeira, J.; Porto Trillo, P.; Vilar Figueira, O.; Novo Díaz, C.; Pereira Loureiro, J. A framework for a new approach to empower users through low-cost and do-it-yourself assistive technology. Int. J. Environ. Res. Public Health 2021, 18, 3039. [Google Scholar] [CrossRef] [PubMed]
  33. Jader, A.M.A. Factors Affecting the Behavioral Intention to Adopt Web-Based Recruitment in Human Resources Departments in Telecommunication Companies in Iraq. Al-Anbar Univ. J. Econ. Adm. Sci. 2022, 14, 404–416. [Google Scholar]
  34. Zheng, K.; Kumar, J.; Kunasekaran, P.; Valeri, M. Role of smart technology use behaviour in enhancing tourist revisit intention: The theory of planned behaviour perspective. Eur. J. Innov. Manag. 2022. ahead of print. [Google Scholar] [CrossRef]
  35. Choe, J.Y.; Kim, J.J.; Hwang, J. Innovative robotic restaurants in Korea: Merging a technology acceptance model and theory of planned behaviour. Asian J. Technol. Innov. 2022, 30, 466–489. [Google Scholar] [CrossRef]
  36. Li, L. A critical review of technology acceptance literature. Ref. Res. Pap. 2010, 4, 2010. [Google Scholar]
  37. Yuen, K.F.; Wong, Y.D.; Ma, F.; Wang, X. The determinants of public acceptance of autonomous vehicles: An innovation diffusion perspective. J. Clean. Prod. 2020, 270, 121904. [Google Scholar] [CrossRef]
  38. Venkatesh, V.; Davis, F.D. A theoretical extension of the technology acceptance model: Four longitudinal field studies. Manag. Sci. 2000, 46, 186–204. [Google Scholar] [CrossRef]
  39. Olushola, T.; Abiola, J. The efficacy of technology acceptance model: A review of applicable theoretical models in information technology researches. J. Res. Bus. Manag. 2017, 4, 70–83. [Google Scholar]
  40. Van der Heijden, H. User acceptance of hedonic information systems. MIS Q. 2004, 28, 695–704. [Google Scholar] [CrossRef]
  41. Liao, Y.K.; Wu, W.Y.; Le, T.Q.; Phung, T.T.T. The integration of the technology acceptance model and value-based adoption model to study the adoption of e-learning: The moderating role of e-WOM. Sustainability 2022, 14, 815. [Google Scholar] [CrossRef]
  42. Venkatesh, V.; Bala, H. Technology acceptance model 3 and a research agenda on interventions. Decis. Sci. 2008, 39, 273–315. [Google Scholar] [CrossRef]
  43. Venkatesh, V.; Thong, J.Y.; Xu, X. Consumer acceptance and use of information technology: Extending the unified theory of acceptance and use of technology. MIS Q. 2012, 36, 157–178. [Google Scholar] [CrossRef]
  44. Chang, A. UTAUT and UTAUT 2: A review and agenda for future research. Winners 2012, 13, 10–114. [Google Scholar] [CrossRef]
  45. Lowry, P.B.; Gaskin, J.; Twyman, N.; Hammer, B.; Roberts, T. Taking ‘fun and games’ seriously: Proposing the hedonic-motivation system adoption model (HMSAM). J. Assoc. Inf. Syst. 2012, 14, 617–671. [Google Scholar] [CrossRef]
  46. Hu, L.; Filieri, R.; Acikgoz, F.; Zollo, L.; Rialti, R. The effect of utilitarian and hedonic motivations on mobile shopping outcomes. A cross-cultural analysis. Int. J. Consum. Stud. 2023, 47, 751–766. [Google Scholar] [CrossRef]
  47. Bagozzi, R.P.; Baumgartner, J.; Yi, Y. An investigation into the role of intentions as mediators of the attitude-behavior relationship. J. Econ. Psychol. 1989, 10, 35–62. [Google Scholar] [CrossRef]
  48. Eagly, A.H.; Chaiken, S. The Psychology of Attitudes; Harcourt Brace Jovanovich College Publishers: New York, NY, USA, 1993. [Google Scholar]
  49. Mallat, N. Exploring consumer adoption of mobile payments—A qualitative study. J. Strateg. Inf. Syst. 2007, 16, 413–432. [Google Scholar] [CrossRef]
  50. Kim, Y.; Park, Y.; Choi, J. A study on the adoption of IoT smart home service: Using Value-based Adoption Model. Total Qual. Manag. Bus. Excell. 2017, 28, 1149–1165. [Google Scholar] [CrossRef]
  51. Demiris, G.; Oliver, D.P.; Washington, K.T. Defining and analyzing the problem. In Behavioral Intervention Research in Hospice and Palliative Care: Building an Evidence Base; Academic Press: Cambridge, MA, USA, 2019; pp. 27–39. [Google Scholar]
  52. Reinares-Lara, E.; Olarte-Pascual, C.; Pelegrín-Borondo, J.; Pino, G. Nanoimplants that enhance human capabilities: A cognitive-affective approach to assess individuals’ acceptance of this controversial technology. Psychol. Mark. 2016, 33, 704–712. [Google Scholar] [CrossRef]
  53. Pelegrin-Borondo, J.; Reinares-Lara, E.; Olarte-Pascual, C. Assessing the acceptance of technological implants (the cyborg): Evidences and challenges. Comput. Hum. Behav. 2017, 70, 104–112. [Google Scholar] [CrossRef]
  54. Reinares-Lara, E.; Olarte-Pascual, C.; Pelegrín-Borondo, J. Do you want to be a cyborg? The moderating effect of ethics on neural implant acceptance. Comput. Hum. Behav. 2018, 85, 43–53. [Google Scholar] [CrossRef]
  55. Dragović, M. Factors Affecting RFID Subcutaneous Microchips Usage. In Proceedings of the Sinteza 2019-International Scientific Conference on Information Technology and Data Related Research; Singidunum University: Belgrade, Serbia, 2019; pp. 235–243. [Google Scholar]
  56. Murata, K.; Arias-Oliva, M.; Pelegrín-Borondo, J. Cross-cultural study about cyborg market acceptance: Japan versus Spain. Eur. Res. Manag. Bus. Econ. 2019, 25, 129–137. [Google Scholar]
  57. Boella, N.; Gîrju, D.; Gurviciute, I. To Chip or Not to Chip? Determinants of Human RFID Implant Adoption by Potential Consumers in Sweden & the Influence of the Widespread Adoption of RFID Implants on the Marketing Mix. Master’s Thesis, Lund University, Lund, Sweden, 2019. [Google Scholar]
  58. Gauttier, S. ‘I’ve got you under my skin’—The role of ethical consideration in the (non-) acceptance of insideables in the workplace. Technol. Soc. 2019, 56, 93–108. [Google Scholar] [CrossRef]
  59. Pelegrín-Borondo, J.; Arias-Oliva, M.; Murata, K.; Souto-Romero, M. Does ethical judgment determine the decision to become a cyborg? J. Bus. Ethics 2020, 161, 5–17. [Google Scholar] [CrossRef]
  60. Žnidaršič, A.; Baggia, A.; Pavlíček, A.; Fischer, J.; Rostański, M.; Werber, B. Are we Ready to Use Microchip Implants? An International Cross-sectional Study. Organizacija 2021, 54, 275–292. [Google Scholar] [CrossRef]
  61. Sabogal-Alfaro, G.; Mejía-Perdigón, M.A.; Cataldo, A.; Carvajal, K. Determinants of the intention to use non-medical insertable digital devices: The case of Chile and Colombia. Telemat. Inform. 2021, 60, 101576. [Google Scholar] [CrossRef]
  62. Arias-Oliva, M.; Pelegrín-Borondo, J.; Murata, K.; Gauttier, S. Conventional vs. disruptive products: A wearables and insideables acceptance analysis: Understanding emerging technological products. Technol. Anal. Strateg. Manag. 2021, 1–13. [Google Scholar] [CrossRef]
  63. Chebolu, R.D. Exploring Factors of Acceptance of Chip Implants in the Human Body. Bachelor’s Thesis, University of Central Florida, Orlando, FL, USA, 2021. [Google Scholar]
  64. Ahadzadeh, A.S.; Wu, S.L.; Lee, K.F.; Ong, F.S.; Deng, R. My perfectionism drives me to be a cyborg: Moderating role of internal locus of control on propensity towards memory implant. Behav. Inf. Technol. 2023, 1–14. [Google Scholar] [CrossRef]
  65. Davis, F.D. Perceived usefulness, perceived ease of use, and user acceptance of information technology. MIS Q. 1989, 13, 319–340. [Google Scholar] [CrossRef]
  66. Boughzala, I. How Generation Y Perceives Social Networking Applications in Corporate Environments. In Integrating Social Media into Business Practice, Applications, Management, and Models; IGI Global: Hershey, PA, USA, 2014; pp. 162–179. [Google Scholar]
  67. Pelegrín-Borondo, J.; Reinares-Lara, E.; Olarte-Pascual, C.; Garcia-Sierra, M. Assessing the moderating effect of the end user in consumer behavior: The acceptance of technological implants to increase innate human capacities. Front. Psychol. 2016, 7, 132. [Google Scholar] [CrossRef]
  68. Oliva, M.A.; Borondo, J.P. Cyborg Acceptance in Healthcare Services: Theoretical Framework. In Paradigm Shifts in ICT Ethics: Proceedings of the ETHICOMP* 2020; Universidad de La Rioja: Logroño, Spain, 2020; pp. 50–55. [Google Scholar]
  69. Nguyen, N.T.; Biderman, M.D. Studying ethical judgments and behavioral intentions using structural equations: Evidence from the multidimensional ethics scale. J. Bus. Ethics 2008, 83, 627–640. [Google Scholar] [CrossRef]
  70. Venkatesh, V.; Morris, M.G.; Davis, G.B.; Davis, F.D. User acceptance of information technology: Toward a unified view. MIS Q. 2003, 27, 425–478. [Google Scholar] [CrossRef]
  71. Pramatari, K.; Theotokis, A. Consumer acceptance of RFID-enabled services: A model of multiple attitudes, perceived system characteristics and individual traits. Eur. J. Inf. Syst. 2009, 18, 541–552. [Google Scholar] [CrossRef]
  72. Perakslis, C.; Michael, K.; Michael, M.; Gable, R. Perceived barriers for implanting microchips in humans: A transnational study. In Proceedings of the 2014 IEEE Conference on Norbert Wiener in the 21st Century (21CW), Boston, MA, USA, 24–26 June 2014; pp. 1–8. [Google Scholar]
  73. Giger, J.C.; Gaspar, R. A look into future risks: A psychosocial theoretical framework for investigating the intention to practice body hacking. Hum. Behav. Emerg. Technol. 2019, 1, 306–316. [Google Scholar] [CrossRef]
  74. Wolbring, G.; Diep, L.; Yumakulov, S.; Ball, N.; Yergens, D. Social robots, brain machine interfaces and neuro/cognitive enhancers: Three emerging science and technology products through the lens of technology acceptance theories, models and frameworks. Technologies 2013, 1, 3–25. [Google Scholar] [CrossRef]
  75. Salovaara, A.; Tamminen, S. Acceptance or appropriation? A design-oriented critique of technology acceptance models. In Future Interaction Design II; Springer: Berlin/Heidelberg, Germany, 2009; pp. 157–173. [Google Scholar]
  76. Holden, H.; Rada, R. Understanding the influence of perceived usability and technology self-efficacy on teachers’ technology acceptance. J. Res. Technol. Educ. 2011, 43, 343–367. [Google Scholar] [CrossRef]
  77. Card, S.K.; Moran, T.P.; Newell, A. The Psychology of Human-Computer Interaction; CRC Press: Cleveland, OH, USA, 2018. [Google Scholar]
  78. Martinez, A.P.; Scherer, M.J. Matching Person & Technology (MPT) Model for Technology Selection as well as Determination of Usability and Benefit from Use; Department of Physical Medicine & Rehabilitation, University of Rochester Medical Center: Rochester, NY, USA, 2018.
  79. Kim, H.W.; Chan, H.C.; Gupta, S. Value-based adoption of mobile internet: An empirical investigation. Decis. Support Syst. 2007, 43, 111–126. [Google Scholar] [CrossRef]
  80. Davis, F.D.; Bagozzi, R.P.; Warshaw, P.R. User acceptance of computer technology: A comparison of two theoretical models. Manag. Sci. 1989, 35, 982–1003. [Google Scholar] [CrossRef]
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
