Technologies 2013, 1(1), 3-25; doi:10.3390/technologies1010003

Article
Social Robots, Brain Machine Interfaces and Neuro/Cognitive Enhancers: Three Emerging Science and Technology Products through the Lens of Technology Acceptance Theories, Models and Frameworks
Gregor Wolbring 1,*, Lucy Diep 1,†, Sophya Yumakulov 1,†, Natalie Ball 2 and Dean Yergens 1

1 Department of Community Health Sciences, University of Calgary, Calgary, AB T2N 4N1, Canada; E-Mails: ldiep@ucalgary.ca (L.D.); sophya.yumakulov@gmail.com (S.Y.); dyergens@ucalgary.ca (D.Y.)
2 Faculty of Medicine, University of Calgary, Calgary, AB T2N 4N1, Canada; E-Mail: neball@ucalgary.ca
† These authors contributed equally to this work.
* Author to whom correspondence should be addressed; E-Mail: gwolbrin@ucalgary.ca; Tel.: +1-403-210-7083; Fax: +1-403-220-6494.
Received: 24 February 2013; in revised form: 21 May 2013; Accepted: 28 May 2013; Published: 10 June 2013

Abstract

Social robotics, brain machine interfaces and neuro and cognitive enhancement products are three emerging science and technology products with wide-reaching impact for disabled and non-disabled people. Acceptance of ideas and products depends on multiple parameters, and many models have been developed to predict product acceptance. We investigated which frequently employed technology acceptance models (consumer theory, innovation diffusion theory, theory of reasoned action, theory of planned behaviour, social cognitive theory, self-determination theory, technology acceptance model, Unified Theory of Acceptance and Use of Technology (UTAUT) and UTAUT2) are employed in the social robotics, brain machine interface and neuro and cognitive enhancement product literature, and which of the core measures used in the technology acceptance models are implicitly or explicitly engaged with in that literature.
Keywords: social robotics; assistive robotics; brain machine interface; brain computer interface; neuroprosthetics; neuroenhancement; cognitive enhancement; technology acceptance model; disabled people; people with disabilities; therapeutics; therapeutic enhancements

1. Introduction

Social robotics is a rapidly growing field which offers innovative and ever-more complex technologies for use within a range of sectors including education, healthcare and service [1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16]. Brain machine interfaces (BMI), another evolving field, hold future prospects for helping disabled people, assisting soldiers, and gaming [17,18,19,20,21,22,23,24,25,26,27,28,29,30,31,32]. The technology involves the interaction of human thought with an external device (e.g., robot, robotic limb, smart wheelchair, communication device) which translates the user’s intent into an executed action [33,34]. This can be achieved through invasive (surgical) or non-invasive (non-surgical) procedures. The clinical viability of BMI technology for disabled people is determined by an analysis weighing costs (surgical risks, financial accessibility, reliability) against benefits (improvement of quality of life) [35,36,37,38]. BMI technology can be used by disabled people to gain functions seen as species-typical [39,40,41], but it can at the same time give disabled people beyond species-typical abilities (therapeutic enhancement). BMI could likewise be used by non-disabled people, the “healthy”, to gain beyond species-typical abilities (non-therapeutic enhancement). Finally, there is an increasing discussion around human enhancement beyond the normal or species-typical [42,43,44,45,46,47,48,49,50,51,52,53,54,55,56,57,58,59,60,61,62,63,64,65,66,67,68], and particularly around neuro and cognitive enhancement of the “healthy” [69,70,71,72,73,74,75,76]. The three products covered in this paper belong to a new category of therapeutic products––therapeutic products that can lead to therapeutic and non-therapeutic enhancements [43,60,61,77,78,79].

Acceptance of ideas and products, whether sold as consumables or therapeutics, depends on several parameters. For consumer goods, parameters include the perceived utility, quality and value of the product [80], risk perceptions and consumer concerns [81], and the weighing of risks and benefits [82]. For example, Frewer concludes that “the most important determinant of consumer acceptance of genetic engineering in food technology is likely to be perceptions of benefit resulting from application of the technology” [83]. According to Greenhalgh et al. and others, over 1,000 papers on the diffusion, spread and sustainability of innovation in health service organisations exist [84].

Many models have been developed to predict product acceptance. We will use the following models to focus on user acceptance of social robotics, BMI, and neuro and cognitive enhancements: consumer theory [85,86,87,88]; innovation diffusion theory (IDT) [89,90,91,92]; theory of reasoned action (TRA) [93,94,95,96,97,98,99]; theory of planned behaviour (TPB) [100,101,102,103]; social cognitive theory (SCT) [104,105,106,107,108]; self-determination theory (SDT) [109,110,111,112,113,114,115,116,117]; technology acceptance model (TAM) [118,119,120,121,122,123,124,125,126]; Unified Theory of Acceptance and Use of Technology (UTAUT) [127,128,129,130,131,132,133,134,135,136,137]; UTAUT2 [138]; model of PC utilization (MPCU) [137,139]; and motivational model (MM) [137]. Consumer theory is defined as the relationship between consumer preference for goods and services and expenditure. Grounded in sociology, the IDT model “emphasizes the process by which an innovation or new knowledge is accepted or rejected by a particular group or organization over time” [91]. The TRA model is based on the social psychology of human behavior and is used to predict behaviors according to an individual’s attitude toward the behavior (positive or negative feelings) and subjective norm (whether a behavior should or should not be performed according to the individual’s perception of what the people closest to them think). TPB is an extension of the TRA model that adds perceived behavioral control, which “refers to people’s perception of the ease or difficulty of performing the behavior of interest” [97]. SCT looks at the relationship between individual behavior, the environment, and the people that influence an individual’s acquisition and maintenance of behavioral patterns. SDT refers to the intrinsic (e.g., inherent satisfaction, personal interest) and extrinsic (e.g., compliance, rewards/punishment) factors for motivation. TAM holds that perceived usefulness and perceived ease of use influence an individual’s acceptance and use of technology. UTAUT predicts how well a new technology will be embraced by users based on performance expectancy, effort expectancy, social influence and facilitating conditions, moderated by gender, age, experience and voluntariness of use [138]. UTAUT2 is an extension of UTAUT that includes hedonic motivation (i.e., perceived enjoyment), price value, and experience and habit, without voluntariness as a moderating variable [138]. MPCU offers a set of constructs that competes with TRA and TPB in predicting PC use behavior. MM refers to intrinsic and extrinsic motivational factors for understanding user adoption of new technology. Each model employs a variety of core measures to ascertain consumer acceptance.
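To keep these overlapping constructs straight, the sketch below records each model’s core constructs in a plain data structure; the labels are our shorthand for the paragraph above, not the canonical wording of any cited model specification, and the lists are abbreviated.

```python
# Illustrative mapping of acceptance models to (abbreviated) core constructs.
ACCEPTANCE_MODELS = {
    "TRA": ["attitude toward behavior", "subjective norm"],
    "TPB": ["attitude toward behavior", "subjective norm",
            "perceived behavioral control"],
    "TAM": ["perceived usefulness", "perceived ease of use"],
    "MM": ["extrinsic motivation", "intrinsic motivation"],
    "MPCU": ["job fit", "complexity", "long-term consequences",
             "affect towards use", "social factors", "facilitating conditions"],
    "IDT": ["relative advantage", "image", "visibility", "compatibility",
            "results demonstrability", "voluntariness of use"],
    "SCT": ["outcome expectations", "self-efficacy", "affect", "anxiety"],
    "UTAUT": ["performance expectancy", "effort expectancy",
              "social influence", "facilitating conditions"],
    "UTAUT2": ["performance expectancy", "effort expectancy", "social influence",
               "facilitating conditions", "hedonic motivation", "price value",
               "habit"],
}

# Example: the constructs UTAUT2 adds relative to UTAUT.
print(set(ACCEPTANCE_MODELS["UTAUT2"]) - set(ACCEPTANCE_MODELS["UTAUT"]))
```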

We present the results of our evaluation of social robotics, brain machine interfaces and neuro/cognitive enhancement literature through the lens of core measures of the various technology acceptance models.

2. Experimental Section

Table 1 describes the search strategies for the BMI, social robotics, and enhancement literature. The searches took place in May 2012, and RIS files of all the articles found were imported into Knowledge Share (KSv2) [140]. This tool was used to systematically review the literature (abstracts of articles) by a process of inclusion/exclusion based on the following criteria. For BMI––include: full PDF available, English language; exclude: books, conference announcements, purely technical articles. For social robotics––include: full PDF available, English language; exclude: books, conference announcements, purely technical articles. For neuroenhancement and cognitive enhancement––include: full text available, English language, within humans and non-rehabilitative (focus on increasing capabilities beyond the “normal”); exclude: books, conference announcements, purely technical articles. Articles were reviewed for inclusion separately by two researchers; Kappa scores were calculated, and any disagreements were addressed individually until a consensus could be reached for the article in question.
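The Kappa scores reported in Table 1 are Cohen’s kappa, which corrects the raw agreement rate between the two screeners for the agreement expected by chance. A minimal sketch of the calculation, with purely hypothetical include/exclude decisions for ten abstracts, is:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters making include/exclude decisions."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement, computed from each rater's marginal category rates.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    expected = sum((freq_a[c] / n) * (freq_b[c] / n)
                   for c in set(rater_a) | set(rater_b))
    return (observed - expected) / (1 - expected)

# Hypothetical screening decisions by two reviewers for ten abstracts.
reviewer_1 = ["include", "exclude", "include", "include", "exclude",
              "exclude", "include", "exclude", "exclude", "exclude"]
reviewer_2 = ["include", "exclude", "include", "exclude", "exclude",
              "exclude", "include", "exclude", "exclude", "exclude"]
print(round(cohens_kappa(reviewer_1, reviewer_2), 2))  # 0.78
```

In this toy example the observed agreement is 0.9 and the chance agreement 0.54, giving kappa of about 0.78, in the range of the scores reported in Table 1.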

Articles which were agreed upon by the two researchers were imported into Atlas.ti 7.0.75, a qualitative analysis software package, and underwent thematic analysis, wherein the texts were coded for content related to the core measures of the different technology acceptance models. Generally, the core measures of each model were evident implicitly in the articles; very few articles discussed or mentioned the actual models or core measures explicitly. The literature was analyzed by two researchers to increase the reliability of our findings.
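The counts later reported in Table 2 are, in effect, tallies of distinct articles per (discourse, core measure) pair. A minimal sketch of that bookkeeping, with hypothetical codings standing in for the actual coding export, is:

```python
from collections import Counter

# Hypothetical (article_id, discourse, core_measure) codings.
codings = [
    ("a1", "social robotics", "perceived usefulness"),
    ("a1", "social robotics", "attitude toward it"),
    ("a1", "social robotics", "perceived usefulness"),  # same article, second quote
    ("a2", "social robotics", "perceived usefulness"),
    ("a3", "neuro-enhancement", "long term consequences"),
]

# n = number of distinct articles in which a measure was coded, per discourse;
# set() collapses repeated codings of the same measure within one article.
n_per_cell = Counter((discourse, measure) for _, discourse, measure in set(codings))
for (discourse, measure), n in sorted(n_per_cell.items()):
    print(f"{discourse} | {measure}: n = {n}")
```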

Table 1. Literature search summary.
| Topic | Databases | Keywords | # Articles Found | # Articles Included | Kappa |
|---|---|---|---|---|---|
| BMI | ScienceDirect, Scopus, OVID (All), EBSCO (All), Web of Science, and JSTOR | “brain machine interface” | 1,058 | 71 | 0.99 |
| Social robotics | Science Direct, Compendex, IEEE, Communication Abstracts, Scopus, OVID (All), EBSCO (All), Academic One File, Web of Science, and JSTOR | “social robot” | 489 | 171 | 0.88 |
| Neuroenhancement and cognitive enhancement | JSTOR, ScienceDirect, PubMed, EBSCO, Academic Search Complete, Web of Science, and Scopus (Elsevier) | “neuro-enhancement”, “cognitive enhancement” | 361 (neuro-enhancement); 1,022 (cognitive enhancement) | 61 (neuro-enhancement); 82 (cognitive enhancement) | 0.90 (neuro-enhancement); 0.79 (cognitive enhancement) |

3. Limitations

For BMI technology and social robotics, our search terms were limited to “brain machine interface” and “social robot” respectively, which excluded articles that use different terms such as “brain-computer interface” or “companion robot”, to name just two. We do not claim that our results cover all the literature for a given product, as we used only one search term rather than all possible terms (a brain machine interface, for example, is also referred to as a “neuroprosthesis” or “brain computer interface”), and we did not search every academic database available. However, we did perform Google Scholar searches combining, for example, “brain computer interface” with “technology acceptance model” to see whether this would generate hits different from those obtained with “brain machine interface” and “technology acceptance model”; ten more hits were generated using BCI instead of BMI. Furthermore, we know of the work of, for example, the group of Nijboer et al., which is pertinent to our research question [141,142,143]. The output of this group did not appear in the 1,058 BMI articles, although Nijboer et al. were cited a few times by others, and their work also did not show up when we replaced BMI with BCI in our database searches. Nijboer et al. stated, among other things, that it is important to investigate user experience, that this is rarely done in current BMI research, and that user-centered approaches would increase user acceptance [141,142,143]. In short, if one does not cover every article and every database with the least limiting keyword, one must expect to miss some work, especially in fields where the terminology is still evolving, as is the case in the social robotics and BMI fields. While our search terms limited the articles captured, we believe our sample is large enough to allow us to reach some conclusions, and future research could compare our results with those obtained with other keyword combinations. Furthermore, we submit that the landscape of results will change down the road, as the fields we investigated are still evolving; our results reflect the literature up to May 2012.
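The keyword comparison described above is essentially a set difference over the two hit lists. A minimal sketch, with purely hypothetical article identifiers, is:

```python
# Hypothetical identifiers returned by two keyword combinations (illustrative only).
bmi_tam_hits = {"doi:10.1/a", "doi:10.1/b", "doi:10.1/c"}
bci_tam_hits = {"doi:10.1/b", "doi:10.1/c", "doi:10.1/d", "doi:10.1/e"}

# Hits found by the BCI search term but missed by the BMI search term.
only_bci = bci_tam_hits - bmi_tam_hits
print(len(only_bci), sorted(only_bci))
```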

4. Results

First, we evaluated whether the following technology acceptance models (TRA; TAM; MM; TPB; MPCU; IDT; SCT; UTAUT; UTAUT2; consumer choice; SDT) had already been employed to investigate consumer sentiment toward the three science and technology products (social robots, brain machine interfaces and neuro/cognitive enhancers) covered in this paper. Two of the articles covering social robots looked at the UTAUT model [144,145]; the articles that mentioned the UTAUT model also mentioned TAM, and some articles mentioned TAM as well as TRA and TPB [146,147]. No article covering brain machine interfaces or neuro/cognitive enhancers mentioned any of the technology acceptance models by name. This finding indicates a possible area of analysis and research around social robots, brain machine interfaces and neuro/cognitive enhancers.
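This first pass amounts to a simple text screen of each article for the model names and their acronyms. A minimal sketch (pattern list abbreviated, input text hypothetical) is:

```python
import re

# Abbreviated name/acronym patterns for a few of the models (illustrative only).
MODEL_PATTERNS = {
    "TAM": [r"technology acceptance model", r"\btam\b"],
    "UTAUT": [r"unified theory of acceptance and use of technology", r"\butaut\b"],
    "TRA": [r"theory of reasoned action", r"\btra\b"],
    "TPB": [r"theory of planned behaviou?r", r"\btpb\b"],
}

def models_mentioned(text: str) -> set[str]:
    """Return the models whose full name or acronym appears in the text."""
    low = text.lower()
    return {model for model, patterns in MODEL_PATTERNS.items()
            if any(re.search(p, low) for p in patterns)}

print(models_mentioned("We apply the UTAUT model, extending TAM."))
# -> {'UTAUT', 'TAM'} (set order may vary)
```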

Second, we looked at whether the discourses around social robots, brain machine interfaces and neuro/cognitive enhancers covered aspects that are seen as core measures within the various technology acceptance models, even if the models themselves are not mentioned by name. Table 2 shows the frequency of coverage of the various core measures within the articles covering social robots, brain machine interfaces and neuro/cognitive enhancers. In general, the neuro and cognitive enhancement discourses covered more of the core measures than the social robotics or brain machine interface discourses. However, when a core measure was covered, it was often covered by only one article, so there is room for a more solid foundation of discussion of the core measures even in the neuro/cognitive enhancement discourse. Table S1, found in the supplementary file, gives concrete examples from the articles, and the 78 references of the articles that covered core measures.

Table 2. Core measures of technology acceptance models covered in the neuroenhancement, cognitive enhancement, brain machine interface and social robotics discourses (n = number of articles).
| Theory | Core Measure of a Given Theory | Neuro-Enhancement | BMI for Enhancement Purposes | BMI for Restorative Purposes | Social Robotics |
|---|---|---|---|---|---|
| Theory of Reasoned Action (TRA) | Attitude toward it (an individual’s positive or negative feeling about performing the target behavior) | n = 1 | n = 1 | n = 11 | n = 9 |
| | Subjective norm (the person’s perception that most people who are important to him think he should or should not perform the behavior in question) | n = 1 | n = 1 | n = 4 | n = 2 |
| Technology acceptance model (TAM) | Perceived usefulness (increase job performance) | n = 2 | n = 1 | n = 2 | n = 7 |
| | Perceived ease of use | n = 2 | n = 1 | n = 2 | n = 1 |
| | Subjective norm | n = 4 | n = 1 | n = 1 | |
| Motivational model (MM) | Extrinsic motivation (users want to perform an action because it is perceived to achieve a valued outcome that is outside of the activity, like jobs…) | n = 5 | n = 1 | n = 3 | n = 1 |
| | Intrinsic motivation (for no external reason but purely for the process of performing the activity per se) | n = 10 | | n = 1 | |
| Theory of planned behavior (TPB) | Attitude towards behavior | n = 1 | | | n = 8 |
| | Subjective norm | n = 4 | n = 1 | n = 1 | |
| | Perceived behavioral control (the perceived ease or difficulty of performing the behavior; perception of internal and external constraints on behaviour) | n = 1 | | | n = 2 |
| Model of Personal Computer (PC) utilization (MPCU) | Job fit | n = 8 | | | n = 2 |
| | Complexities | n = 3 | | | |
| | Long term consequences (pay-off in the future) | n = 19 | | | n = 13 |
| | Affect towards use (positive or negative feeling towards it) | n = 1 | | | n = 7 |
| | Social factors (internalization of the reference group’s subjective culture, and interpersonal agreements) | n = 3 | | | n = 2 |
| | Facilitating conditions (objective factors in the environment that observers agree make an act easy to accomplish) | n = 5 | | n = 1 | n = 13 |
| Innovation Diffusion Theory (IDT) | Relative advantage | n = 18 | | n = 1 | n = 3 |
| | Image (enhance one’s image) | n = 9 | n = 1 | | n = 1 |
| | Visibility (one can see others using it) | n = 4 | | | |
| | Compatibility (perceived as being consistent with the existing values, needs and past experiences of potential adopters) | n = 14 | | | |
| | Results demonstrability | n = 5 | | | |
| | Voluntariness of use | n = 4 | | | |
| Social cognitive theory (SCT) | Outcome expectations––performance (consequences) | n = 10 | | n = 1 | n = 2 |
| | Outcome expectations––personal (personal consequences such as esteem and sense of accomplishment) | n = 10 | | n = 1 | n = 1 |
| | Self-efficacy | n = 2 | | | n = 3 |
| | Affect (liking) | n = 1 | | | |
| | Anxiety | n = 1 | | | |
| UTAUT | Performance expectancy | n = 10 | | | |
| | Effort expectancy | n = 1 | | | |
| | Social influence | n = 4 | | | |
| | Facilitating conditions (age, gender, experience, voluntariness) | n = 1 | | | n = 6 |
| UTAUT2 | Performance expectancy | n = 12 | | | |
| | Effort expectancy | n = 4 | | | |
| | Social influence | n = 1 | | | |
| | Facilitating conditions (age, gender, experience) | n = 1 | | | |
| | Hedonistic motivation | n = 13 | | n = 1 | n = 2 |
| | Price value | n = 3 | | | |
| | Habit | n = 3 | | | |
| | Behavioural intention | | | n = 6 | |
| Consumer choice | Social factors (family, friends, other people, trends, gender, age, entertainment) | n = 1 | | | n = 4 |
| | Environmental factors (lifestyle) | n = 5 | | n = 2 | |
| | Personal factors (needs, wants, likes, time, values, emotion, knowledge, hobbies) | n = 3 | n = 1 | n = 2 | n = 3 |
| | Economic factors (affordability, value for money) | n = 2 | | | n = 1 |
| | Psychological factors (planned buying, impulse buying, to bribe, award or encourage someone, emotions, celebration, advertisement) | n = 6 | | n = 2 | |
| Self-determination theory (SDT) | Intrinsic motivation (competence, autonomy, relatedness, interest, enjoyment, inherent satisfaction) | n = 9 | | n = 1 | n = 6 |
| | External motivation (compliance, external rewards and punishment) | n = 11 | | n = 1 | n = 1 |

Table 2 and Table S1 in the supplementary file also reveal which core measures were not mentioned at all.

The following core measures have not been covered within social robotics: intrinsic motivation (for no external reason but purely for the process of performing the activity per se); subjective norm; complexities; visibility (one can see others using it); compatibility (perceived as being consistent with the existing values, needs and past experiences of potential adopters); results demonstrability; voluntariness of use; affect (liking); anxiety; performance expectancy; effort expectancy; social influence; facilitating conditions (age, gender, experience, voluntariness); hedonistic motivation; habit; behavioural intention; social factors (family, friends, other people, trends, gender, age, entertainment); environmental factors (lifestyle); personal factors (needs, wants, likes, time, values, emotion, knowledge, hobbies); psychological factors (planned buying, impulse buying, to bribe, award or encourage someone, emotions, celebration, advertisement); and intrinsic motivation (competence, autonomy, relatedness, interest, enjoyment, inherent satisfaction).

As to neuro and cognitive enhancement, we did not find articles covering voluntariness of use or behavioural intention. Only one study really looked at social factors, and competitiveness is the one environmental factor covered.

As for the BMI articles, the following core measures have not been covered: long term consequences (pay-off in the future); affect towards use (positive or negative feeling towards it); social factors, whether in the consumer choice sense (family, friends, other people, trends, gender, age, entertainment) or the MPCU sense (internalization of the reference group’s subjective culture, and interpersonal agreements); voluntariness of use; self-efficacy; affect (liking); anxiety; performance expectancy; effort expectancy; social influence; facilitating conditions (age, gender, experience); and hedonistic motivation. Some core measures are mentioned within the restorative discourse of brain machine interfaces but not within the enhancement discourse, for example: environmental factors (lifestyle); personal factors (needs, wants, likes, time, values, emotion, knowledge, hobbies); psychological factors (planned buying, impulse buying, to bribe, award or encourage someone, emotions, celebration, advertisement); intrinsic motivation (competence, autonomy, relatedness, interest, enjoyment, inherent satisfaction); and external motivation (compliance and external rewards and punishment).

As for the core measures that were mentioned (Table S1 in the supplemental file), the sentiment revealed in all our case studies was both positive and negative. In the case of neuro/cognitive enhancement, the core measures of attitude and subjective norm were covered negatively, with negative consequences being perceived. Economic gain, competitiveness and efficiency were mentioned as external motivators, and performance at school and work was mentioned as an internal motivator; these were mentioned factually, without judgment. As to long term consequences, some mentioned were positive, such as economic gain, and some were negative, such as increased inequity, undetermined safety, and rising norms that make the non-compliant feel deficient. As to factors that make it easier to take up enhancement, positive media portrayal was mentioned. As to the core measure of compatibility, the desire to be competitive, to be efficient and to achieve economic gain was seen as fitting well with neuroenhancers, whereas neuroenhancers were seen as incompatible with the values of safety, free will and equality. Males were seen as being more in favor of enhancements. As to psychological factors, coercion was mentioned, as were commercialization and media portrayal.

In the case of BMI applications, authenticity was revealed under the core measure of attitude toward the use of BMI as an enhancement. The attitude toward authenticity concerns whether one’s skills, talents and abilities remain genuine if they are enhanced by BMI technology, and the authenticity of ‘free choices’ under the pressures of technology consumerism was also thematized [25]. Authenticity was further revealed as an extrinsic motivation in the MM model, defined as one’s ability to “lead a more authentic life” through the use of BMI as an enhancement. This shows a tension between how one is perceived when using BMI as an enhancement and what one envisions as possibilities with its use. Through the TRA model, the subjective norm of BMI for the purpose of enhancement revolves around the novelty of BMI technology and the weighing of costs and benefits, whereas from the perspective of using BMI for restorative purposes, the subjective sentiment is more favorable, centred on the potential improvement of one’s quality of life.

In the case of social robotics, there was a particular focus on core measures related to attitudes toward social robots (n = 9 in TRA, n = 8 in TPB, n = 7 in PC utilization), perceived usefulness (n = 7), long-term consequences (n = 13 in PC utilization), and facilitating conditions such as age (n = 13 in PC utilization, n = 6 in UTAUT). In general, articles explored positive personal experiences with robots, such as entertainment value, pleasure gains, and increased independence, with some noting that older generations are generally less accepting of technology [148] and that overly human-like robots would be perceived as creepy [149,150]. Perceived usefulness included social robots taking over dangerous jobs and relieving human resource pressures (especially in eldercare).

5. Discussion

5.1. Uses of Technology Acceptance Models

The Technology Acceptance Model (TAM) was initially developed to explain what makes end-users accept a wide range of computing technologies [135]. TAM focuses on perceived ease of use and perceived usefulness. In the original 1989 paper, Davis defined perceived ease of use as “the degree to which a person believes that using a particular system would be free of effort” and perceived usefulness as “the degree to which a person believes that using a particular system would enhance his or her job performance” [135], and proposed that perceived ease of use affects perceived usefulness. One study using TAM to look at human-computer interaction among high school students in Taiwan found that perceived ease of use can predict perceived usefulness and vice versa; moreover, perceived ease of use and perceived usefulness can predict attitude toward using [151].
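TAM’s paths are, in effect, ordinary regression claims and thus directly testable on survey data. The sketch below simulates Likert-style scores under the posited relationships (perceived ease of use influencing perceived usefulness, and both influencing attitude toward using) and recovers the paths with least squares; all numbers are synthetic and illustrative only.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Synthetic 7-point Likert scores for perceived ease of use (PEOU).
peou = rng.integers(1, 8, n).astype(float)
# Posited paths: PEOU -> PU, and PEOU + PU -> attitude toward using.
pu = 1.0 + 0.5 * peou + rng.normal(0.0, 1.0, n)
attitude = 0.5 + 0.4 * pu + 0.2 * peou + rng.normal(0.0, 1.0, n)

def ols(y, *xs):
    """Ordinary least squares: intercept followed by one coefficient per regressor."""
    X = np.column_stack([np.ones(len(y)), *xs])
    return np.linalg.lstsq(X, y, rcond=None)[0]

print(ols(pu, peou))            # recovers roughly [1.0, 0.5]: PEOU predicts PU
print(ols(attitude, peou, pu))  # recovers roughly [0.5, 0.2, 0.4]: both predict attitude
```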

Five studies in the BMI literature we covered mentioned ease of use as important for BMI success [28,37,152,153,154]; these studies focused on the non-invasive version of BMIs. Indeed, others have looked at the linkage of invasiveness and acceptability outside BMI [155]. As to perceived usefulness, it is recognized within the BMI literature that “perceived usability can be described by many factors including ease of use, controllability (i.e., speed, accuracy, responsiveness, sensitivity), effectiveness of technology, effectiveness of mental approaches, cosmesis, and overall satisfaction” [152]. Various articles covered perceived usefulness, mostly within the framework of whether an invasive or non-invasive BMI version is more useful. One study investigating the perceptions of special education teachers on BMI [41] found that ease of use was a major concern that the teachers voiced in regards to their disabled students using a BMI. The same study revealed that the teachers envisioned that BMIs have the potential to be useful by furthering the independence, social participation and quality of life of the special education student [41], but they also believed that invasiveness decreases the usefulness of BMIs. However, the special education teachers also voiced other parameters which one would not list under the two parameters of TAM, such as affordability, the need for the device to be inconspicuous (they feared that their students would be ostracized if their new device were too visible), the reliability of the device, and concerns related to the reaction of peers [41].

The issue of “ease of use” is also covered in the social robotics literature. Seven social robotics articles mentioned ease of use as important [13,144,146,148,150,156,157]. As to perceived usefulness, sixteen articles covered this angle, with two articles reporting on survey results [158,159]. However, within the social robotics literature it is also realized that one has to go beyond ease of use and perceived usefulness. One article states that there is a need for “design guidelines as to how to develop socially acceptable robots to be used for social skill intervention for children with ASD (autism spectrum disorder)” [160]. Given the realization that TAM is too limited, some of the social robotics literature engaged with further technology acceptance models, something not evident in the BMI literature. Some articles looked at the UTAUT model [144,145,146,147], which goes beyond ease of use and perceived usefulness by including various other factors. The Almere model was developed by modifying the UTAUT model further to adapt it to the special needs of social robotics, adding measures such as perceived sociability, social presence, social influence and perceived enjoyment [144]. Another study added user needs as a parameter [161]. Others modified the Almere model further, following the concept of product ecology [162].

As to neuro/cognitive enhancements, none of the technology acceptance models were mentioned. We also obtained no hits when we used “consumer acceptance models”, another area of inquiry that tries to understand consumer behavior and might be useful. Interestingly, Donovan, Egger, Kapernick and Mendoza investigated in 2002 what might prevent so-called non-disabled athletes from taking illegal performance-enhancing drugs in sport. They found that the likelihood of drug use will be highest when (a) threat appraisal is low; (b) benefit appraisal is high; (c) personal morality is neutral; (d) perceived legitimacy of the laws and enforcement agency is low; (e) relevant reference groups are supportive of drug use; and (f) vulnerability on personality factors (e.g., low self-esteem, risk taking, pessimism) is high [163].

We submit that acceptance models have a useful role to play in informing not only researchers and engineers but also policy makers as to how a product should be developed, what might drive demand for the product, and what the impact might be of the drivers that become evident in the acceptability investigation. Within the frame of health care, such data are important because, for example, many of these non-restorative enhancements might be delivered through the health care system [25,58], making them part of the discourse around health consumerism and consumer personalized medicine [164,165,166,167].

5.2. Modified Technology Acceptance Models

The original TAM has been modified by many. Some changes were linked to the field of application, such as adding compatibility as a factor to investigate the uptake of information systems by health professionals in Canada [133]. Some changes were linked to the place investigated; one study investigating factors impacting innovation in a product development organization found that organizational support, cost-effectiveness, system quality, organizational need, and functional effectiveness were predictors of uptake [168]. Some changes were linked to social groups, such as one study that added internal and external motivation when it investigated older people [169]; the UTAUT2 model added gender as a parameter. Musa modified TAM to be more applicable to development purposes, adding “the linkages between factors of national development (socioeconomic development) and technological infrastructure (as captured by accessibility to technology)” and capturing “individuals’ perceptions of the negative and positive impact factors” [170].

We submit that no one model in our table encompasses all the facets needed to investigate acceptance and consequences of various non-restorative enhancements; however, modified versions of models such as the UTAUT and consumer choice model and combinations of the ones we looked at could still be useful.

It is believed by some that the existing technology acceptance models are not able to drive design, but are used only to verify designs [171]. The applications we cover in this paper are still emerging and are still open to changes in trajectory as to how they will be implemented, how they are employed, and with what consequences. For the results to be able to drive design, we submit that open-ended questions should be used in acceptability investigation questionnaires to obtain a differentiated picture of what people think about the products, especially around social context, social factors, and internal and external motivations. For example, in the case of marginalized populations such as disabled people, we need a way to ascertain their views not only within the framework of being a patient population but also within a framework of self-understanding in which the disabled person does not perceive themselves as impaired [58]. We also have to understand who wants or does not want neuro or other forms of enhancement, and why people want or do not want neuroenhancements, BMI or social robots. This knowledge is important for the product generators.

Some social robotics groups modified the UTAUT to include core measures that would fit their product, and BMI is seen to be in need of a more tailored evaluation [141,142,143]. We submit that human-centered design [172,173,174] (or user-centered design [142]) and a participatory design process [175,176], which involves co-designing with generative design tools [177], is one possible avenue that could be utilized more by researchers and engineers. The participatory design could start with a modified UTAUT with open-ended questions covering design, needs, motivation and social factor considerations; a sketch of such an instrument follows below. This is an especially good starting point for emerging products which are not readily available to the public (as is the case for the products highlighted in this article), and it would defuse the critique that acceptability models are used only to verify design rather than drive it [171].
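A minimal sketch of what such a starting instrument could look like, pairing UTAUT-style constructs with open-ended prompts, follows; the construct names follow UTAUT, but the item wording is purely our invention and would need co-design with the intended users.

```python
# Hypothetical item bank: each construct paired with a closed (Likert) item
# and an open-ended prompt; wording is illustrative only.
questionnaire = {
    "performance expectancy": {
        "likert": "Using this device would improve my daily activities.",
        "open_ended": "What would the device have to do for you to find it useful?",
    },
    "effort expectancy": {
        "likert": "Learning to use this device would be easy for me.",
        "open_ended": "What would make the device hard or easy for you to use?",
    },
    "social influence": {
        "likert": "People important to me would support my using this device.",
        "open_ended": "How would the people around you react to you using it?",
    },
    "design and needs (open-ended only)": {
        "open_ended": "Describe the design you would want and the needs it should meet.",
    },
}

for construct, items in questionnaire.items():
    print(construct.upper())
    for kind, wording in items.items():
        print(f"  [{kind}] {wording}")
```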

Open-ended questions covering design, needs, motivation and social factor considerations not only ascertain information on the designs wanted and needed, but also generate information on what makes the product desirable, where funding should be sourced from (for development, and for the actual purchase down the road by the consumer), and the social factors and contexts influencing product acceptance. This information is useful for the design as well as for the ethical, legal, economic, social and policy discourse around the product’s development, and as such should be gathered as early as possible. While it is not feasible to consult with every potential user on the design of a product, developing and applying a well-rounded acceptability model can generate a better understanding of the social context around the acceptance or rejection of a product, in addition to helping verify or disprove a design.

6. Conclusions

We submit that, in general, all three discourses under-investigate the different facets of what makes people accept or reject a given product. There is room for improvement in the social robotics, brain machine interface and neuro/cognitive enhancement discourses as to investigating people’s sentiments towards these emerging products.

Acknowledgments

This work was in part supported by a Social Sciences and Humanities Research Council (SSHRC) grant (GW and NB), an SSHRC (CURA) grant (GW and LD) and a University of Calgary Bridgefund (SY).

Conflict of Interest

The authors declare no conflict of interest.

References

  1. Sekiyama, K.; Fukuda, T. Toward Social Robotics. In Proceedings of the AAAI 1997 Fall Symposium Series, Socially Intelligent Agents, Cambridge, MA, USA, 8–10 November 1997; AAAI Press: Menlo Park, CA, USA, 1997; pp. 118–124. Available online: http://www.aaai.org/Papers/Symposia/Fall/1997/FS-97-02/FS97-02-028.pdf (accessed on 30 March 2013).
  2. Dautenhahn, K.; Billard, A. Studying Robot Social Cognition within A Developmental Psychology Framework. In Proceedings of the Third European Workshop on Advanced Mobile Robots (Eurobot’99), Zurich, Switzerland, 6–8 September 1999; IEEE: Palo Alto, CA, USA, 1999; pp. 187–194.
  3. Giron-Sierra, J.M.; Halawa, S.; Rodriguez-Sanchez, J.R.; Alcaide, S. A Social Robotics Experimental Project. In Proceedings of the 30th Annual Frontiers in Education Conference, Kansas City, MO, USA, 18–21 October 2000; pp. 1–18.
  4. Restivo, S. Bringing up and Booting up: Social Theory and the Emergence of Socially Intelligent Robots. In Proceedings of 2001 IEEE International Conference on Systems, Man and Cybernetics, Tuscon, AZ, USA, 7–10 October 2001; Institute of Electrical and Electronics Engineers Inc.: Palo Alto, CA, USA, 2001; pp. 2110–2117.
  5. Fong, T.; Nourbakhsh, I.; Dautenhahn, K. A survey of socially interactive robots. Robot. Auton. Syst. 2003, 42, 143–166, doi:10.1016/S0921-8890(02)00372-X.
  6. Saunders, J.; Nehaniv, C.L.; Dautenhahn, K. An Experimental Comparison of Imitation Paradigms Used in Social Robotics. In Proceedings of RO-MAN 2004—The 13th IEEE International Workshop on Robot and Human Interactive Communication, Kurashiki, Okayama, Japan, 20–22 September 2004; Institute of Electrical and Electronics Engineers Inc.: Palo Alto, CA, USA, 2004; pp. 691–696.
  7. Dautenhahn, K.; Woods, S.; Kaouri, C.; Walters, M.L.; Kheng, L.K.; Werry, I. What is A Robot Companion––Friend, Assistant or Butler? In Proceedings of 2005 IEEE/RSJ International Conference on Intelligent Robots and Systems, Edmonton, AB, Canada, 2–6 August 2005; pp. 1192–1197.
  8. Sparrow, R.; Sparrow, L. In the hands of machines? The future of aged care. Minds Mach. 2006, 16, 141–161, doi:10.1007/s11023-006-9030-6.
  9. Tapus, A.; Mataric, M.J.; Scasselati, B. Socially assistive robotics [Grand challenges of robotics]. IEEE Robot. Automat. 2007, 14, 35–42.
  10. Turkle, S. Authenticity in the age of digital companions. Interact. Stud. 2007, 8, 501–517.
  11. Ham, J.; Bokhorst, R.; Cuijpers, R.; van der Pol, D.; Cabibihan, J.J. Making Robots Persuasive: The Influence of Combining Persuasive Strategies (Gazing and Gestures) by A Storytelling Robot on Its Persuasive Power. In Proceedings of the 3rd International Conference on Social Robotics, Amsterdam, The Netherlands, 24–25 November 2011; Springer Verlag: Heidelberg, Germany, 2011; pp. 71–83.
  12. Dougherty, E.G.; Scharfe, H. Initial Formation of Trust: Designing an Interaction with Geminoid-DK to Promote a Positive Attitude for Cooperation. In Proceedings of the 3rd International Conference on Social Robotics, Amsterdam, The Netherlands, 24–25 November 2011; Springer-Verlag: Heidelberg, Germany, 2011; pp. 95–103.
  13. Fink, J.; Bauwens, V.; Mubin, O.; Kaplan, F.; Dillenbourg, P. People’s Perception of Domestic Service Robots: Same Household, Same Opinion? In Proceedings of the 3rd International Conference on Social Robotics, Amsterdam, The Netherlands, 24–25 November 2011; Springer-Verlag: Heidelberg, Germany, 2011; pp. 204–213.
  14. Ferguson, M.; Webb, N.; Strzalkowski, T. Nelson: A Low-Cost Social Robot for Research and Education. In Proceedings of the 42nd ACM Technical Symposium on Computer Science Education, Dallas, TX, USA, 9–12 March 2011; Association for Computing Machinery: New York, NY, USA, 2011; pp. 225–229.
  15. Gruebler, A.; Berenz, V.; Suzuki, K. Coaching Robot Behavior Using Continuous Physiological Affective Feedback. In Proceedings of the 11th IEEE-RAS International Conference on Humanoid Robots (Humanoids), Bled, Slovenia, 26–28 October 2011; pp. 466–471.
  16. Prado, J.A.; Simplicio, C.; Lori, N.F.; Dias, J. Visuo-auditory multimodal emotional structure to improve human-robot-interaction. Int. J. Soc. Robot. 2012, 4, 29–51, doi:10.1007/s12369-011-0134-7.
  17. Donoghue, J.P. Bridging the brain to the world: A perspective on neural interface systems. Neuron 2008, 60, 511–521, doi:10.1016/j.neuron.2008.10.037.
  18. Brain Machine Interface: BMI (Cyborg Soldiers), 2008. Available online: http://www.staticbrain.com/archive/brain-machine-interface-bmi-cyborg-soldiers/ (accessed on 30 March 2013).
  19. Rudolph, A. Military: Brain machine could benefit millions. Nature 2003, 424, 369, doi:10.1038/424369b.
  20. Zeigler, B.P. The brain-machine disanalogy revisited. Biosystems 2002, 64, 127–140, doi:10.1016/S0303-2647(01)00181-2.
  21. Pratt School of Engineering Duke University Darpa to Support Development of Human Brain-Machine Interface, Available online: http://www.pratt.duke.edu/pratt_press/web.php?sid=4&iid=2 (accessed on 30 January 2013).
  22. Mussa-Ivaldi, F.A.; Miller, L.E. Brain-machine interfaces: Computational demands and clinical needs meet basic neuroscience. Trends Neurosci. 2003, 26, 329–334, doi:10.1016/S0166-2236(03)00121-8.
  23. Wetware the Status of Brain-Machine Interfaces, Available online: http://wetware.hjalli.com/000124.shtml (accessed on 30 January 2013).
  24. Patil, P.G.; Turner, D.A. The development of brain-machine interface neuroprosthetic devices. Neurotherapeutics 2008, 5, 137–146, doi:10.1016/j.nurt.2007.11.002.
  25. Bostrom, N.; Sandberg, A. Cognitive enhancement: Methods, ethics, regulatory challenges. Sci. Eng. Ethics 2009, 15, 311–341, doi:10.1007/s11948-009-9142-5.
  26. Yokoi, H. Cyborg (Brain-machine/computer interface). Adv. Robot. 2009, 23, 1451–1454, doi:10.1163/016918609X12469657764904.
  27. Guenther, F.H.; Brumberg, J.S.; Wright, E.J.; Nieto-Castanon, A.; Tourville, J.A.; Panko, M.; Law, R.; Siebert, S.A.; Bartels, J.L.; Andreasen, D.S. A wireless brain-machine interface for real-time speech synthesis. PLoS ONE 2009, 4, e8218, doi:10.1371/journal.pone.0008218.
  28. Menon, C.; de Negueruela, C.; Millán, J.R.; Tonet, O.; Carpi, F.; Broschart, M.; Ferrez, P.; Buttfield, A.; Tecchio, F.; Sepulveda, F. Prospects of brain-machine interfaces for space system control. Acta Astronaut. 2009, 64, 448–456, doi:10.1016/j.actaastro.2008.09.008.
  29. Lebedev, M.A.; Tate, A.J.; Hanson, T.L.; Li, Z.; O’Doherty, J.E.; Winans, J.A.; Ifft, P.J.; Zhuang, K.Z.; Fitzsimmons, N.A.; Schwarz, D.A. Future developments in brain-machine interface research. Clinics 2011, 66, 25–32, doi:10.1590/S1807-59322011001300004.
  30. Mahmoudi, B.; Sanchez, J.C. A symbiotic brain-machine interface through value-based decision making. PLoS ONE 2011, 6, e14760, doi:10.1371/journal.pone.0014760.
  31. Martin, A.R.; Sankar, T.; Lipsman, N.; Lozano, A.M. Brain-machine interfaces for motor control: A guide for neuroscience clinicians. Can. J. Neurol. Sci. 2012, 39, 11–22.
  32. Shyamkumar, P.; Oh, S.; Banerjee, N.; Varadan, V.K. A wearable remote brain machine interface using smartphones and the mobile network. Adv. Sci. Technol. 2013, 85, 11–16, doi:10.4028/www.scientific.net/AST.85.11.
  33. Tamburrini, G. Brain to computer communication: Ethical perspectives on interaction models. Neuroethics 2009, 2, 137–149, doi:10.1007/s12152-009-9040-1.
  34. Velliste, M.; Perel, S.; Spalding, M.C.; Whitford, A.S.; Schwartz, A.B. Cortical control of a prosthetic arm for self-feeding. Nature 2008, 453, 1098–1101, doi:10.1038/nature06996.
  35. Gilja, V.; Chestek, C.A.; Diester, I.; Henderson, J.M.; Deisseroth, K.; Shenoy, K.V. Challenges and opportunities for next-generation intracortically based neural prostheses. IEEE Trans. Biomed. Eng. 2011, 58, 1891–1899, doi:10.1109/TBME.2011.2107553.
  36. Mason, S.G.; Jackson, M.M.M.; Birch, G.E. A general framework for characterizing studies of brain interface technology. Ann. Biomed. Eng. 2005, 33, 1653–1670, doi:10.1007/s10439-005-7706-3.
  37. Wolpaw, J.R.; Birbaumer, N.; McFarland, D.J.; Pfurtscheller, G.; Vaughan, T.M. Brain-computer interfaces for communication and control. Clin. Neurophysiol. 2002, 113, 767–791, doi:10.1016/S1388-2457(02)00057-3.
  38. Bashirullah, R. Wireless implants. IEEE Microw. Mag. 2010, 11, 14–23, doi:10.1109/MMM.2010.938579.
  39. Clausen, J. Man, machine and in between. Nature 2009, 457, 1080–1081, doi:10.1038/4571080a.
  40. Clausen, J. Conceptual and ethical issues with brain-hardware interfaces. Curr. Opin. Psychiatry 2011, 24, 495–501.
  41. Diep, L.; Wolbring, G. Who needs to fit in? Who gets to stand out? Communication technologies including brain-machine interfaces revealed from the perspectives of special education school teachers through an ableism lens. Educ. Sci. 2013, 3, 30–49, doi:10.3390/educsci3010030.
  42. Lewens, T. The risks of progress: Precaution and the case of human enhancement. J. Risk Res. 2010, 13, 207–216, doi:10.1080/13669870903126242.
  43. Coenen, C.; Schuijff, M.; Smits, M.; Klaassen, P.; Hennen, L.; Rader, M.; Wolbring, G. Human Enhancement Study. Available online: http://www.europarl.europa.eu/RegData/etudes/etudes/join/2009/417483/IPOL-JOIN_ET(2009)417483_EN.pdf (accessed on 30 March 2013).
  44. Gunson, D. Cognitive enhancement, analogical reasoning and social justice. J. Int. Biotechnol. Law 2009, 6, 133–149, doi:10.1515/JIBL.2009.19.
  45. Buchanan, A. Moral status and human enhancement. Philos. Public Aff. 2009, 37, 346–381, doi:10.1111/j.1088-4963.2009.01166.x.
  46. Riis, J.; Simmons, J.P.; Goodwin, G.P. Preferences for enhancement pharmaceuticals: The reluctance to enhance fundamental traits. J. Consum. Res. 2008, 35, 495–508, doi:10.1086/588746.
  47. Beck, S. Enhancement as a legal challenge. J. Int. Biotechnol. Law 2007, 4, 75–81.
  48. Irish Council for Bioethics. Human Enhancement: Making People Better or Making Better People? Irish Council for Bioethics 2007, Available online: http://www.bioethics.ie/uploads/docs/Humanenh.pdf (accessed on 30 March 2013).
  49. Tomasini, F. Imagining human enhancement: Whose future, which rationality? Theor. Med. Bioeth. 2007, 28, 497–507, doi:10.1007/s11017-007-9055-8.
  50. Williams, A.E. Good, Better, Best: The Human Quest for Enhancement Summary Report of An Invitational Workshop Convened by the Scientific Freedom, Responsibility and Law Program American Association for the Advancement of Science 1–2 June 2006, Available online: http://www.aaas.org/spp/sfrl/projects/human_enhancement/pdfs/HESummaryReport.pdf (accessed on 30 March 2013).
  51. Rothman, S.R.D. The Pursuit of Perfection: The Promise and Perils of Medical Enhancement; Pantheon Books: New York, NY, USA, 2005.
  52. Baylis, F.; Robert, J.S. The inevitability of genetic enhancement technologies. Bioethics 2004, 18, 1–26.
  53. Caplan, A.E.C. Is it ethical to use enhancement technologies to make us better than well? PLoS Med. 2004, 1, e52, doi:10.1371/journal.pmed.0010052.
  54. Farah, M.; Illes, J.; Cook-Deegan, R.; Gardner, H.; Kandel, E.; King, P.; Parens, E.; Sahakian, B.; Wolpe, P.R. Neurocognitive enhancement: What can we do and what should we do? Nat. Rev. Neurosci. 2004, 5, 421–425.
  55. Khushf, G. Systems theory and the ethics of human enhancement—A framework for NBIC convergence. Ann. N.Y. Acad. Sci. 2004, 1013, 124–149, doi:10.1196/annals.1305.007.
  56. Brodey, W.M.; Lindgren, N. Human enhancement––Beyond machine age. IEEE Spectr. 1968, 5, 79–93, doi:10.1109/MSPEC.1968.5214775.
  57. President’s Council on Bioethics. Beyond Therapy: Biotechnology and the Pursuit of Happiness; US Government: Washington, DC, USA, 2003.
  58. Wolbring, G. HTA Initiative #23: The Triangle of Enhancement Medicine, Disabled People, and the Concept of Health: A New Challenge for HTA, Health Research, and Health Policy; Alberta Heritage Foundation for Medical Research (AHFMR): Edmonton, AB, Canada, 2005.
  59. Wolbring, G. The Unenhanced Underclass. In Better Humans? The Politics of Human Enhancement; Wilsdon, J.M.P., Ed.; Demos Institute: London, UK, 2006.
  60. Wolbring, G. Why NBIC? Why human performance enhancement? Innov. Eur. J. Soc. Sci. Res. 2008, 21, 25–40, doi:10.1080/13511610802002189.
  61. Wolbring, G. Ableism, Enhancement Medicine and the Techno Poor Disabled. In Unnatural Selection: The Challenges of Engineering Tomorrow’s People; Healey, P., Rayner, S., Eds.; Earthscan-Routledge: Florence, SC, USA, 2008.
  62. Wolbring, G. Nanotechnology and the Transhumanization of Health, Medicine, and Rehabilitation. In Controversies in Science and Technology Volume 3: From Evolution to Energy; Kleinmann, D.L., Delborne, J., Cloud-Hansen, K., Handelsman, J., Eds.; Mary Ann Liebert: New Rochelle, NY, USA, 2010; pp. 290–303.
  63. Savulescu, J. New breeds of humans: The moral obligation to enhance. Reprod. Biomed. Online 2005, 10, 36–39, doi:10.1016/S1472-6483(10)62202-X.
  64. Savulescu, J.; Kahane, G. The moral obligation to create children with the best chance of the best life. Bioethics 2009, 23, 274–290, doi:10.1111/j.1467-8519.2008.00687.x.
  65. Harris, J. Enhancing Evolution: The Ethical Case for Making Better People; Princeton University Press: Princeton, NJ, USA, 2007.
  66. Harris, J. Enhancing Evolution: The Ethical Case for Making Better People (New in Paper); Princeton University Press: Princeton, NJ, USA, 2010.
  67. Harris, J. Taking the “Human” out of human rights. Camb. Q. Healthc. Ethics 2011, 20, 9–20, doi:10.1017/S0963180109990570.
  68. Harris, J. Sparrows, hedgehogs and castrati: Reflections on gender and enhancement. J. Med. Ethics 2011, 37, 262–266.
  69. Forlini, C. Examining Discourses on the Ethics and Public Understanding of Cognitive Enhancement with Methylphenidate. Ph.D. Thesis, University of Montreal, Montreal, QC, Canada, 2009.
  70. Racine, E.; Forlini, C. Expectations regarding cognitive enhancement create substantial challenges. J. Med. Ethics 2009, 35, 469–470, doi:10.1136/jme.2009.030460.
  71. Bostrom, N.; Roache, R. Smart policy: Cognitive enhancement and the public interest. Contemp. Read. Law Soc. Justice 2010, 2, 68–84.
  72. Outram, S.M.; Racine, E. Developing public health approaches to cognitive enhancement: An analysis of current reports. Public Health Ethics 2011, 4, doi:10.1093/phe/phr006.
  73. Partridge, B.J.; Bell, S.K.; Lucke, J.C.; Yeates, S.; Hall, W.D. Smart drugs “As common as coffee”: Media hype about neuroenhancement. PLoS ONE 2011, 6, e28416, doi:10.1371/journal.pone.0028416.
  74. Franke, A.G.; Bonertz, C.; Christmann, M.; Engeser, S.; Lieb, K. Attitudes toward cognitive enhancement in users and nonusers of stimulants for cognitive enhancement: A pilot study. AJOB Prim. Res. 2012, 3, 48–57.
  75. Sarewitz, D.; Karas, T.H. Policy Implications of Technologies for Cognitive Enhancement. In Neurotechnology: Premises, Potential, and Problems; Giordano, J., Ed.; CRC Press: Boca Raton, FL, USA, 2012; pp. 267–285.
  76. Lucke, J.C. Empirical research on attitudes toward cognitive enhancement is essential to inform policy and practice guidelines. AJOB Prim. Res. 2012, 3, 58–60, doi:10.1080/21507716.2011.645268.
  77. Wolbring, G. Is there an end to out-able? Is there an end to the rat race for abilities? J. Media Cult. 2008, 11. Available online: http://journal.media-culture.org.au/index.php/mcjournal/article/viewArticle/57 (accessed 30 March 2013).
  78. Wolbring, G. Therapeutic, enhancement enabling, assistive devices and the UN Convention on the rights of persons with disabilities: A missing lens in the enhancement regulation discourse. J. Int. Biotechnol. Law 2009, 6, 193–206.
  79. Wolbring, G. Therapeutic enhancements and the view of rehabilitation educators. DILEMATA Int. J. Appl. Ethics 2012, 8, 169–183.
  80. Lyon, R.H. A Sound Guide to Product Acceptance, Available online: http://www.aip.org/tip/INPHFA/vol-4/iss-1/p50.pdf (accessed on 30 March 2013).
  81. Frewer, L.; Scholderer, J.; Lambert, N. Consumer acceptance of functional foods: Issues for the future. Br. Food J. 2003, 105, 714–731, doi:10.1108/00070700310506263.
  82. Verbeke, W.; Vanhonacker, F.; Frewer, L.J.; Sioen, I.; de Henauw, S.; van Camp, J. Communicating risks and benefits from fish consumption: Impact on Belgian consumers’ perception and intention to eat fish. Risk Anal. 2008, 28, 951–967.
  83. Frewer, L.J.; Howard, C.; Shepherd, R. Genetic engineering and food: What determines consumer acceptance? Br. Food J. 1995, 97, 31–36, doi:10.1108/00070709510100118.
  84. Greenhalgh, T.; Robert, G.; Bate, P.; Macfarlane, F.; Kyriakidou, O. Diffusion of Innovations in Health Service Organisations; Blackwell Publishing Ltd.: Mississauga, ON, Canada, 2005.
  85. Caselli, F.; Ventura, J. A representative consumer theory of distribution. Am. Econ. Rev. 2000, 90, 909–926, doi:10.1257/aer.90.4.909.
  86. Diewert, W.E. Hedonic Regressions. A Consumer Theory Approach; University of Chicago Press: Chicago, IL, USA, 2003.
  87. Kronenberg, T. Finding common ground between ecological economics and post-Keynesian economics. Ecol. Econ. 2010, 69, 1488–1494, doi:10.1016/j.ecolecon.2010.03.002.
  88. Gualerzi, D. Growth Theory, Structural Dynamics and the Analysis of Consumption. In Structural Dynamics and Economic Growth; Arena, R., Porta, P.L., Eds.; Cambridge University Press: Cambridge, UK, 2012; pp. 181–203.
  89. Mahajan, V. Models for Innovation Diffusion; Sage Publications, Inc: Thousand Oaks, CA, USA, 1985; Volume 48.
  90. Sultan, F.; Farley, J.U.; Lehmann, D.R. A meta-analysis of applications of diffusion models. J. Mark. Res. 1990, 27, 70–77, doi:10.2307/3172552.
  91. Lee, T.T. Nurses adoption of technology: Application of Rogers innovation-diffusion model. Appl. Nurs. Res. 2004, 17, 231–238.
  92. Rogers, E.M. Diffusion of Innovations; Free Press: New York, NY, USA, 1995.
  93. Fishbein, M. A theory of reasoned action: Some applications and implications. Nebr. Symp. Motiv. 1980, 27, 65–116.
  94. Sheppard, B.H.; Hartwick, J.; Warshaw, P.R. The theory of reasoned action: A meta-analysis of past research with recommendations for modifications and future research. J. Consum. Res. 1988, 15, 325–343.
  95. Millstein, S.G. Utility of the theories of reasoned action and planned behavior for predicting physician behavior: A prospective analysis. Health Psychol. 1996, 15, 398–402, doi:10.1037/0278-6133.15.5.398.
  96. Hausenblas, H.A.; Carron, A.V.; Mack, D.E. Application of the theories of reasoned action and planned behavior to exercise behavior: A meta-analysis. J. Sport Exerc. Psychol. 1997, 19, 36–51.
  97. Chang, M.K. Predicting unethical behavior: A comparison of the theory of reasoned action and the theory of planned behavior. J. Bus. Ethics 1998, 17, 1825–1834, doi:10.1023/A:1005721401993.
  98. Belleau, B.D.; Summers, T.A.; Xu, Y.; Pinel, R. Theory of reasoned action. Cloth. Text. Res. J. 2007, 25, 244–257, doi:10.1177/0887302X07302768.
  99. Jaccard, J. The reasoned action model directions for future research. Ann. Amer. Acad. Polit. Soc. Sci. 2012, 640, 58–80, doi:10.1177/0002716211426097.
  100. Vermeir, I.; Verbeke, W. Sustainable food consumption among young adults in Belgium: Theory of planned behaviour and the role of confidence and values. Ecol. Econ. 2008, 64, 542–553, doi:10.1016/j.ecolecon.2007.03.007.
  101. Barker, M.; Swift, J.A. The application of psychological theory to nutrition behaviour change. Proc. Nutr. Soc. 2009, 68, 205–209, doi:10.1017/S0029665109001177.
  102. Kasper, J.; Koepke, S.; Fischer, K.; Schaeffler, N.; Backhus, I.; Solari, A.; Heesen, C. Applying the theory of planned behaviour to multiple sclerosis patients decisions on disease modifying therapy questionnaire concept and validation. BMC Med. Inform. Decis. Mak. 2012, 12, doi:10.1186/1472-6947-12-60.
  103. Cote, F.; Gagnon, J.; Houme, P.K.; Abdeljelil, A.B.; Gagnon, M.P. Using the theory of planned behaviour to predict nurses’ intention to integrate research evidence into clinical decision-making. J. Adv. Nurs. 2012, 10, 2289–2298.
  104. Bandura, A. Health promotion from the perspective of social cognitive theory. Psychol. Health 1998, 13, 623–649, doi:10.1080/08870449808407422.
  105. Alkire, S. Subjective quantitative studies of human agency. Soc. Indic. Res. 2005, 74, 217–260, doi:10.1007/s11205-005-6525-0.
  106. Yoo, S.J.; Han, S.H.; Huang, W.H. The roles of intrinsic motivators and extrinsic motivators in promoting e-learning in the workplace: A case from South Korea. Comput. Hum. Behav. 2012, 28, 942–950, doi:10.1016/j.chb.2011.12.015.
  107. Jang, Y.; Yoo, H. Self-management programs based on the Social Cognitive Theory for Koreans with chronic diseases: A systematic review. Contemp. Nurse 2012, 40, 147–159, doi:10.5172/conu.2012.40.2.147.
  108. Bandura, A. Social cognitive theory in cultural context. Appl. Psychol. 2002, 51, 269–290, doi:10.1111/1464-0597.00092.
  109. Williams, G.C.; Deci, E.L. Internalization of biopsychosocial values by medical students: A test of self-determination theory. J. Pers. Soc. Psychol. 1996, 70, 767–779, doi:10.1037/0022-3514.70.4.767.
  110. Ryan, R.M.; Deci, E.L. Self-determination theory and the facilitation of intrinsic motivation, social development, and well-being. Am. Psychol. 2000, 55, 68–78, doi:10.1037/0003-066X.55.1.68.
  111. Deci, E.L.; Ryan, R.M. Self-determination theory: A macrotheory of human motivation, development, and health. Can. Psychol. 2008, 49, 182–185, doi:10.1037/a0012801.
  112. Burton, D.; Gillham, A.D.; Hammermeister, J. Competitive engineering: Structural climate modifications to enhance youth athletes’ competitive experience. Int. J. Sports Sci. Coach. 2011, 6, 201–218, doi:10.1260/1747-9541.6.2.201.
  113. Kapp, S.K. Navajo and autism: The beauty of harmony. Disabil. Soc. 2011, 26, 583–595, doi:10.1080/09687599.2011.589192.
  114. Deci, E.L.; Ryan, R.M. Self-determination theory in health care and its relations to motivational interviewing: A few comments. Int. J. Behav. Nutr. Phys. Act. 2012, 9, doi:10.1186/1479-5868-9-24.
  115. Ng, J.Y.Y.; Ntoumanis, N.; Thøgersen-Ntoumani, C.; Deci, E.L.; Ryan, R.M.; Duda, J.L.; Williams, G.C. Self-determination theory applied to health contexts: A meta-analysis. Perspect. Psychol. Sci. 2012, 7, 325–340, doi:10.1177/1745691612447309.
  116. Deci, E.L.; Ryan, R.M. Motivation, Personality, and Development within Embedded Social Contexts: An Overview of Self-Determination Theory. In The Oxford Handbook of Human Motivation; Ryan, R.M., Ed.; Oxford University Press: New York, NY, USA, 2012; pp. 85–107.
  117. Teixeira, P.J.; Carraça, E.V.; Markland, D.; Silva, M.N.; Ryan, R.M. Exercise, physical activity, and self-determination theory: A systematic review. Int. J. Behav. Nutr. Phys. Act. 2012, 9, doi:10.1186/1479-5868-9-78.
  118. Davis, F.D. A Technology Acceptance Model for Empirically Testing New End-User Information Systems: Theory and Results; Massachusetts Institute of Technology: Boston, MA, USA, 1985.
  119. Mathieson, K. Predicting user intentions: Comparing the technology acceptance model with the theory of planned behavior. Inf. Syst. Res. 1991, 2, 173–191, doi:10.1287/isre.2.3.173.
  120. Gefen, D.; Straub, D.W. Gender differences in the perception and use of e-mail: An extension to the technology acceptance model. MIS. Q. 1997, 21, 389–400, doi:10.2307/249720.
121. Venkatesh, V.; Brown, S.A. A longitudinal investigation of personal computers in homes: Adoption determinants and emerging challenges. MIS. Q. 2001, 25, 71–102, doi:10.2307/3250959.
  122. Pavlou, P.A. Consumer acceptance of electronic commerce: Integrating trust and risk with the technology acceptance model. Int. J. Electron. Commer. 2003, 7, 101–134.
  123. King, W.R.; He, J. A meta-analysis of the technology acceptance model. Inform. Management 2006, 43, 740–755, doi:10.1016/j.im.2006.05.003.
  124. Terrizzi, S.; Sherer, S.; Meyerhoefer, C.; Scheinberg, M.; Levick, D. Extending the technology acceptance model in healthcare: Identifying the role of trust and shared information. AMCIS Proc. 2012. Paper 19.
  125. Belanche, D.; Casalo, L.V.; Flavian, C. Integrating trust and personal values into the Technology Acceptance Model: The case of e-government services adoption. Cuad. Econ. Dir. Empres. 2012, 15, 192–204.
126. Chang, S.H. The impacts of consumer variety-seeking, interaction of demand and technology acceptance model on self-service technology in baby boomers. M.Sc. Thesis, Ming Chuan University, Taipei, Taiwan, 2012.
127. Oshlyansky, L.; Cairns, P.; Thimbleby, H. Validating the Unified Theory of Acceptance and Use of Technology (UTAUT) Tool Cross-Culturally. In Proceedings of the 21st British HCI Group Annual Conference (HCI 2007), Lancaster, UK, 3–7 September 2007; British Computer Society: London, UK, 2007; pp. 83–86.
  128. Im, I.; Kim, Y.; Han, H.J. The effects of perceived risk and technology type on users’ acceptance of technologies. Inform. Management 2008, 45, 1–9, doi:10.1016/j.im.2007.03.005.
  129. Van Schaik, P. Unified theory of acceptance and use for websites used by students in higher education. J. Educ. Comput. Res. 2009, 40, 229–257, doi:10.2190/EC.40.2.e.
130. Im, I.; Hong, S.; Kang, M.S. An international comparison of technology adoption: Testing the UTAUT model. Inform. Management 2011, 48, 1–8, doi:10.1016/j.im.2010.09.001.
  131. Wang, Y.Y.; Townsend, A.; Luse, A.; Mennecke, B. The determinants of acceptance of recommender systems: Applying the UTAUT model. AMCIS Proc. 2012. Paper 2.
  132. Kidd, T.; Davis, T. A Framework to Analyze Faculty Involvement in Online Teaching Using UTAUT and Dewey’s Theory of Experience. In Proceedings of Society for Information Technology & Teacher Education International Conference; Resta, P., Ed.; AACE: Chesapeake, VA, USA, 2012; pp. 505–510.
133. Ifinedo, P. Technology Acceptance by Health Professionals in Canada: An Analysis with a Modified UTAUT Model. In Proceedings of the 45th Hawaii International Conference on System Sciences, Maui, HI, USA, 4–7 January 2012; pp. 2937–2946.
  134. Oye, N.D.; Iahad, A.; Rahim, N. The history of UTAUT model and its impact on ICT acceptance and usage by academicians. Educ. Inf. Technol. 2012, doi:10.1007/s10639-012-9189-9.
  135. Davis, F.D. Perceived usefulness, perceived ease of use, and user acceptance of information technology. MIS. Q. 1989, 13, 319–340, doi:10.2307/249008.
136. BenMessaoud, C.; Kharrazi, H.; MacDorman, K.F. Facilitators and barriers to adopting robotic-assisted surgery: Contextualizing the unified theory of acceptance and use of technology. PLoS ONE 2011, 6, e16395, doi:10.1371/journal.pone.0016395.
137. Venkatesh, V.; Morris, M.G.; Davis, G.B.; Davis, F.D. User acceptance of information technology: Toward a unified view. MIS. Q. 2003, 27, 425–478.
  138. Venkatesh, V.; Thong, J.Y.L.; Xu, X. Consumer acceptance and use of information technology: Extending the unified theory of acceptance and use of technology. MIS. Q. 2012, 36, 157–178.
  139. Atif, A.; Richards, D. A technology acceptance model for unit guide information systems. PACIS Proc. 2012. Paper 97.
140. Yergens, D.R.J.; Doig, C.J. KSv2: Application for Enhancing Scoping and Systematic Reviews. In Proceedings of the American Medical Informatics Association (AMIA) 2012 Annual Symposium, Chicago, IL, USA, 3–7 November 2012.
141. Gürkök, H.; Plass-Oude Bos, D.; van de Laar, B.; Nijboer, F.; Nijholt, A. User experience evaluation in BCI: Filling the gap. Int. J. Bioelectromagn. 2011, 13, 54–55.
  142. Plass-Oude Bos, D.; Gürkök, H.; van de Laar, B.; Nijboer, F.; Nijholt, A. User Experience Evaluation in BCI: Mind the Gap! Int. J. Bioelectromagn. 2011, 13, 48–49.
143. van de Laar, B.; Nijboer, F.; Gürkök, H.; Plass-Oude Bos, D.; Nijholt, A. User experience evaluation in BCI: Bridge the gap. Int. J. Bioelectromagn. 2011, 13, 157–158.
  144. Heerink, M.; Kröse, B.; Evers, V.; Wielinga, B. Assessing acceptance of assistive social agent technology by older adults: The almere model. Int. J. Soc. Robot. 2010, 2, 361–375, doi:10.1007/s12369-010-0068-5.
  145. De Ruyter, B.; Saini, P.; Markopoulos, P.; van Breemen, A. Assessing the effects of building social intelligence in a robotic interface for the home. Interact. Comput. 2005, 17, 522–541, doi:10.1016/j.intcom.2005.03.003.
  146. Salvini, P.; Laschi, C.; Dario, P. Design for acceptability: Improving robots’ coexistence in human society. Int. J. Soc. Robot. 2010, 2, 451–460, doi:10.1007/s12369-010-0079-2.
  147. Young, J.E.; JaYoung, S.; Voida, A.; Sharlin, E.; Igarashi, T.; Christensen, H.I.; Grinter, R.E. Evaluating human-robot interaction: Focusing on the holistic interaction experience. Int. J. Soc. Robot. 2011, 3, 53–67, doi:10.1007/s12369-010-0081-8.
  148. Broadbent, E.; Stafford, R.; MacDonald, B. Acceptance of healthcare robots for the older population: Review and future directions. Int. J. Soc. Robot. 2009, 1, 319–330, doi:10.1007/s12369-009-0030-6.
  149. Mackenzie, R.; Watts, J. Robots, social networking sites and multi-user games: Using new and existing assistive technologies to promote human flourishing. Tizard Learn. Disabil. Rev. 2011, 16, 38–47, doi:10.1108/13595471111185756.
  150. Young, J.E.; Hawkins, R.; Sharlin, E.; Igarashi, T. Toward acceptable domestic robots: Applying insights from social psychology. Int. J. Soc. Robot. 2009, 1, 95–108, doi:10.1007/s12369-008-0006-y.
151. Dai, C.-Y.; Jang, J.-J.; Lee, T.-H.; Chen, Y.-T.; Yuan, Y.-H. Base on Human-Computer Interaction Perspective to Analyze the Factors of Technology Acceptance Model on IRSSP for Taiwan Recommendatory Admission. In Proceedings of the 6th International Conference on Computer Science & Education, Singapore, 3–5 August 2011; IEEE: Palo Alto, CA, USA, 2011; pp. 149–153.
  152. Mason, S.G.; Bashashati, A.; Fatourechi, M.; Navarro, K.F.; Birch, G.E. A comprehensive survey of brain interface technology designs. Ann. Biomed. Eng. 2007, 35, 137–169, doi:10.1007/s10439-006-9170-0.
  153. McCullagh, P.J.; Ware, M.; Mulvenna, M.; Lightbody, G.; Nugent, C.D.; McAllister, H.G. Can brain computer interfaces become practical assistive devices in the community? Stud. Health Technol. Inform. 2010, 160, 314–318.
154. Garipelli, G.; Galan, F.; Chavarriaga, R.; Ferrez, P.W.; Lew, E.; Millan, R. The Use of Brain-Computer Interfacing in Ambient Intelligence. In Constructing Ambient Intelligence; Springer: Berlin/Heidelberg, Germany, 2008; pp. 268–285.
  155. Ziefle, M.; Schaar, A.K. Gender differences in acceptance and attitudes towards an invasive medical stent. Electron. J. Health Inform. 2011, 6, e13:1–e13:18.
  156. Carpenter, J.; Davis, J.M.; Erwin-Stewart, N.; Lee, T.R.; Bransford, J.D.; Vye, N. Gender representation and humanoid robots designed for domestic use. Int. J. Soc. Robot. 2009, 1, 261–265, doi:10.1007/s12369-009-0016-4.
  157. Hegel, F.; Muhl, C.; Wrede, B.; Hielscher-Fastabend, M.; Sagerer, G. Understanding Social Robots. In Proceedings of the Second International Conferences on Advances in Computer Human Interactions, Cancun, Mexico, 1–7 February 2009; pp. 169–174.
  158. Moon, A.J.; Danielson, P.; van der Loos, H.F.M. Survey-based discussions on morally contentious applications of interactive robotics. Int. J. Soc. Robot. 2012, 4, 77–96.
  159. Marcos, S.; Gomez-Garcia-Bermejo, J.; Zalama, E. A realistic, virtual head for human-computer interaction. Interact. Comput. 2010, 22, 176–192, doi:10.1016/j.intcom.2009.12.002.
  160. Welch, K.C.; Lahiri, U.; Warren, Z.; Sarkar, N. An approach to the design of socially acceptable robots for children with autism spectrum disorders. Int. J. Soc. Robot. 2010, 2, 391–403, doi:10.1007/s12369-010-0063-x.
  161. Park, E.; del Pobil, A.P. Users’ attitudes toward service robots in South Korea. Ind. Robot 2013, 40, 77–87, doi:10.1108/01439911311294273.
  162. Qianli, X.; Ng, J.; Cheong, Y.L.; Tan, O.; Wong, J.B.; Tay, T.C.; Park, T. The Role of Social Context in Human-Robot Interaction. In Proceedings of 2012 Southeast Asian Network of Ergonomics Societies Conference, Langkawi, Malaysia, 9–12 July 2012; pp. 1–5.
  163. Donovan, R.J.; Egger, G.; Kapernick, V.; Mendoza, J. A conceptual framework for achieving performance enhancing drug compliance in sport. Sports Med. 2002, 32, 269–284, doi:10.2165/00007256-200232040-00005.
  164. Bloss, C.S.; Ornowski, L.; Silver, E.; Cargill, M.; Vanier, V.; Schork, N.J.; Topol, E.J. Consumer perceptions of direct-to-consumer personalized genomic risk assessments. Genet. Med. 2010, 12, 556–566, doi:10.1097/GIM.0b013e3181eb51c6.
  165. Guttmacher, A.E.; McGuire, A.L.; Ponder, B.; Stefansson, K. Personalized genomic information: Preparing for the future of genetic medicine. Nat. Rev. Genet. 2010, 11, 161–165.
166. Kato, K.; Kano, K.; Shirai, T. Science communication: Significance for genome-based personalized medicine: A view from the Asia-Pacific. Curr. Pharm. 2010, 8, 93–96.
  167. Keller, M.A.; Gordon, E.S.; Stack, C.B.; Gharani, N.; Sill, C.J.; Schmidlen, T.J.; Joseph, M.; Pallies, J.; Gerry, N.P.; Christman, M.F. Coriell Personalized Medicine Collaborative®: A prospective study of the utility of personalized medicine. Pers. Med. 2010, 7, 301–317, doi:10.2217/pme.10.13.
  168. Boone, R.G.; Gordon, J.; Barnes, F.; Fraser-Beekman, S. Factors Impacting Innovation in a Product Development Organization. In Proceedings of 2012 IEEE International Conference on Electro/Information Technology (EIT), Indianapolis, IN, USA, 6–8 May 2012; pp. 1–11.
  169. Conci, M.; Pianesi, F.; Zancanaro, M. Useful, Social and Enjoyable: Mobile Phone Adoption by Older People. In Human-Computer Interaction––INTERACT 2009; Springer: Uppsala, Sweden, 2009; pp. 63–76.
  170. Musa, P.F. Making a case for modifying the technology acceptance model to account for limited accessibility in developing countries. Inf. Technol. Dev. 2006, 12, 213–224, doi:10.1002/itdj.20043.
  171. Salovaara, A.; Tamminen, S. Acceptance or Appropriation? A Design-oriented Critique of Technology Acceptance Models. In Future Interaction Design II; Springer: Heidelberg, Germany, 2009; pp. 157–173.
172. Totter, A.; Bonaldi, D.; Majoe, D. A Human-Centered Approach to the Design and Evaluation of Wearable Sensors: Framework and Case Study. In Proceedings of the 6th International Conference on Pervasive Computing and Applications, Port Elizabeth, South Africa, 26–28 October 2011; IEEE: Palo Alto, CA, USA, 2011; pp. 233–241.
  173. Ziefle, M.; Rocker, C. Human-Centered Design of E-Health Technologies: Concepts, Methods and Applications; IGI Global: Hershey, PA, USA, 2011.
174. Van Velsen, L.; van der Geest, T.; Klaassen, R.; Steehouder, M. User-centered evaluation of adaptive and adaptable systems: A literature review. Knowl. Eng. Rev. 2008, 23, 261–281.
  175. Millen, L.; Cobb, S.; Patel, H. Participatory design approach with children with autism. Int. J. Disabil. Hum. Dev. 2011, 10, 289–294.
  176. Alper, M.; Hourcade, J.P.; Gilutz, S. Interactive Technologies for Children with Special Needs. In Proceedings of the 11th International Conference on Interaction Design and Children, Bremen, Germany, 12–15 June 2012; ACM: New York, NY, USA, 2012; pp. 363–366.
  177. Hussain, S.; Sanders, E.B.-N. Fusion of horizons: Co-designing with Cambodian children who have prosthetic legs, using generative design tools. CoDesign 2012, 8, 43–79, doi:10.1080/15710882.2011.637113.