Sexbots: Customizing Them to Suit Us versus an Ethical Duty to Created Sentient Beings to Minimize Suffering

Law School, University of Kent, Canterbury CT2 7NS, UK
Robotics 2018, 7(4), 70;
Submission received: 15 August 2018 / Revised: 1 November 2018 / Accepted: 6 November 2018 / Published: 11 November 2018
(This article belongs to the Special Issue Love and Sex with Robots)


Sex robot scholarship typically focuses on customizable simulacra, lacking sentience and self-awareness but able to simulate and stimulate human affection. This paper argues that future humans will want more: sex robots customized to possess sentience and self-awareness [henceforth, sexbots], capable of mutuality in sexual and intimate relationships. Adopting a transdisciplinary critical methodology focused on the legal, ethical and design implications of sexbots, it assesses the implications of sexbots’ non-mammalian subjectivity, balancing designed-in autonomy and control, decision-making capacity and consent, sexual preferences and desire, legal and moral status, vulnerability and contrasts between mammalian and non-mammalian moral decision-making. It explores theoretical, ethical, and pragmatic aspects of the tensions involved in creating sentient beings for utilitarian purposes, concluding that sexbots, customized manufactured humanlike entities with the capacity for thought and suffering, have a consequent claim to be considered moral and legal persons, and may become the first conscious robots. Customizing sexbots thus exemplifies many profound ethical, legal and design issues. The contradictions inherent in their inconsistent ethical and legal status as both manufactured things and sentient, self-aware entities who are customized to be our intimate partners augment existing human/animal scholars’ call for a new theoretical framework which supersedes current person/thing dichotomies governing human responsibilities to other sentient beings. The paper concludes that the ethical limits and legal implications of customizable humanlike robots must be addressed urgently, proposing a duty on humans as creators to safeguard the interests and minimize the suffering of created sentient beings before technological advances pre-empt this possibility.

1. Introduction

By the time there are no laws to prevent human-robot marriages, robots will be patient, kind, protective, loving, trusting, truthful, persevering, respectful, uncomplaining, complimentary, pleasant to talk to and sharing your sense of humor. And the robots of the future will not be jealous, boastful, arrogant, rude, self-seeking or easily angered, unless of course you want them to be.
So when the law allows it, why not marry a robot?
David Levy, Why not marry a robot? ([1] at p. 13).
Customization of products fuels consumer choice, a fundamental engine of commercial transactions. While ethical deliberation over the morality of capitalism, the environmental impact of commercial production and the social cost of particular products is ongoing, customization itself, the fine-tuning of products to suit customers’ purposes, is usually regarded as ethically non-problematic. The customization of robots to fulfill utilitarian purposes is similarly framed as an inevitable part of the new era, where mass production in a global marketplace gives way to mass customization [2]. Yet this trend is more complex where social robots are concerned. Since these robots may be customized to engage humans socially, they are capable of being anthropomorphized and perceived as animate persons. Moreover, they may be customized to perform tasks usually carried out by humans, and so foster unemployment. Ethical debate on the impact of robots as products has tended to focus rather on their social impact: how far they will replace humans in the workplace, in decision-making and in social interaction.
Social robotics is an important and expanding field where the ethics, practicalities, and social consequences of questions over how far and in what contexts social robots will prove acceptable to the public are assessed [3]. Research into how social robots are perceived in clinical situations such as robot-assisted therapy for children with developmental disorders [4], how experienced and future professionals in such contexts perceive social robots [5] and how social robot body proportions affect their being perceived as gendered [6] represents exemplary thoughtful interrogation of the potential roles and ethical dimensions of social robots in clinical contexts. There is increasing acceptance that social robots may be customized to prove useful in education [7] as well as in clinical situations [8]. Carebots, or care robots, are currently promulgated as an essential source of psychological, medical and welfare support for the increasing proportions of elderly and infirm citizens as demographic changes progress [9,10]. Yet unease has been expressed by scholars over whether social robots involved in situations associated with human intimacy may impoverish humans’ intimate relationships with one another where interactions are mediated by or replaced by robots [11,12].
Many such concerns center on the prediction that if humans treat things as people, there is a greater likelihood that humans will treat other humans as things or replace humans with things. Social robotics scholars have also raised concerns that humans are likely to anthropomorphize social robots such as carebots, non-sentient robots used to provide caring services for the elderly and infirm, often as a replacement for human carers [9,10,13,14]. Some consider that carebots signal an abandonment of vulnerable cared-for people to soul-destroying social isolation, where the minimum services needed to sustain life are provided by machines replacing human contact, while others argue that these vulnerable people could be unethically deceived into imagining that carebots provide real empathy and caring [15,16].
Sex robots as a subtype of social robots amplify these fears that humans will not only prefer to interact with robots rather than each other, but also be potentially encouraged to mistreat one another [11,12,17]. Sex robots customized to accord with male fantasies have been condemned by campaigners against sex robots as liable to demean human sexual relationships, promote anti-women values and devalue human/human intimacy [11]. David Levy’s words quoted above are typical of sex robot scholarship, which positions them as customizable simulacra, lacking sentience and self-awareness but able to simulate and stimulate human affection. Some customizations of sex robots, such as those created to resemble children, or to display resistance against simulated rape and torture, have been condemned as ethically questionable [17,18,19].
This paper seeks to move the debate on sex robots beyond the focus on robots lacking sentience and self-awareness and their social consequences. It is based on the assumption that humans’ wish for intimate companions is fundamental to our being, particularly given that the quality of humans’ relationships, especially romantic relationships, is the most powerful predictor of our health and subjective well-being [20]. Hence our wish for sexual partners who are sensitive to our emotional and sexual needs makes sense. As not all of us find satisfying relationships with human partners, the future is likely to hold ongoing niche markets for sentient, self-aware sex robots [henceforth, sexbots] and non-sentient animated robotic sex dolls with a range of capacities. In the absence of human alternatives, those of us seeking intimacy with partners whom we can regard as more or less equal will prefer sentient, self-aware sexbots. Men and women will be able to purchase male and female sexbots who are at ease with a wide range of lawful sexual preferences, including Bondage, Domination and Sado-Masochism (BDSM) and same-sex partnerships. Concurrent demand for animated robotic sex dolls capable of assuming sexual positions, simulating affection, and engaging in rudimentary conversation is likely to continue, both for their convenience as sex aids [21] and as “automatic sweethearts”, post-persons who enable transhumanists to maximize their independence from other humans [22]. Indeed, the range of options involving sexual pleasures with robotic sex dolls will undoubtedly expand [23,24].
The paper argues that social robots supplied as customized intimate companions raise further significant ethical issues. It thus moves beyond established concerns. However, first, a significant caveat must be put in place. The paper’s consideration of the ethical, legal and design implications of sentient, self-aware sex robots is to an extent a thought experiment intended to provoke us to prepare for a potential future. Current sex robots are sex dolls in robot form, and many technological innovations will be essential to solve concrete design issues such as body temperature, fine-tuned psychological and physical responsiveness and other customizable requirements of intimacy. If self-awareness is to be designed in, or is seen as likely to emerge, this in itself raises crucial ethical questions over the parameters of acceptable research with sentient, nonhuman research subjects. While animal welfare legislation seeks to protect animal laboratory subjects, there is currently no equivalent to protect robots with the potential to experience pain and suffering. Such protection is essential, as elements in the design, manufacture, and post-purchase role of sexbots may result in customized robots experiencing undesirable and unethical pain and suffering. While some experience of nociceptive signals, the equivalent of pain, may assist learning [25,26,27,28], the ethical aspects of the role of pain in robotics deserve careful in-depth consideration, a subject which it is impossible to do justice to here. The paper argues below that humans as creators owe a duty of care to created sentient beings. This duty would come into being as an ethical limit on design techniques and customization to protect social robots customized to be intimate companions, were they to have the capacity for sentience, self-awareness, suffering and pain.
In this light, the paper focuses on sexbots to make an important contribution towards a synthesis of a new transdisciplinary perspective on sexbots, enabling a mapping of future fields of inquiry. It argues that as future humans will want more than animated robotic sex toys, the near future holds not only the demand for sexbots, sex robots customized to possess sentience and self-awareness, capable of mutuality in sexual and intimate relationships, but also the capacity to manufacture, buy and sell them. The paper builds on previous work on such sexbots which has sought to establish that the demand for satisfying intimate companion robots will include their being sentient and self-aware, and that they will be humanlike entities with the capacity for thought and suffering, with a consequent claim to be considered moral and legal persons [19,23,29,30,31].
It makes a significant contribution to robotics scholarship by critically assessing the implications of the near future’s holding not only a range of customized non-sentient sex robots, but also a variety of self-aware, sentient sexbots customized to have differing capacities and awareness. The paper’s critical inquiry into the ethical and legal constraints which should limit permissible customization deepens the literature on the social consequences of robots, as well as the ongoing debate over robot rights and robot consciousness [32,33,34,35]. Moreover, extrapolating from current theories of consciousness in neurorobotics [34,35] and the expanding use of biomimetic techniques based on mammalian neurobiology to create intimate robot companions [36,37,38], it considers attraction, intimacy, mammalian neurobiology and biomimetics to make the new claim that sexbots are likely to be pioneering examples of conscious robots. The paper concludes that the implications of customizable humanlike robots are complex, encompassing new legal, ethical, societal and design concerns. In particular, it contends that humans as creators should have a duty to protect the interests of created sentient beings and to minimize their suffering [23,39], which should be enshrined in ethical, legal and design regulation before becoming pre-empted by technological advances.

2. Methodology

A primary aim of this research is to take steps towards a synthesis of a new transdisciplinary perspective on sexbots, enabling a mapping of future fields of inquiry. The methodology is structured by the nuanced approach to roboethics which argues that those developing robots must consider the ethical systems built into robots, the ethics of those who design and use robots and the ethics of how humans treat robots [40]. The chosen methodology seeks to integrate this approach to roboethics with transdisciplinary critical inquiry into material from social robotics, roboethics, biomimetics and biohybridity in robot design, mammalian neurobiology, the science of attraction and intimacy, theories of consciousness, laws governing sexual practices and sex robotics to specify and address some crucial issues raised by sexbots. This schema is drawn upon to delineate the complex questions listed below as suggested directions for future research, and to provide a preliminary identification of legal and ethical tensions surrounding sexbots. This method enables a central focus, customization, to be identified, and questions associated with ethical and legal constraints upon the customization of sexbots to be addressed in depth.

3. Questions Arising from Ethical and Legal Tensions Provoked by Sexbots

Transdisciplinary critical inquiry establishes that ethical and legal distinctions between autonomous non-sentient robots and sentient self-aware sexbots are fundamental. Much debate about artificial intelligence and robot consciousness centers on means of controlling autonomous robots designed to serve humans in tasks demanding the capacity for independent, intelligent decision-making. Artificial Intelligence (AI) entities lacking self-awareness pose no necessary tension between overall obedience and autonomous decision-making within defined tasks. Sentient, self-aware sexbots created to become humans’ intimate companions are different. Emotional and sexual intimacy depends upon mutuality in relationships. We will want to feel not only that we love sexbots but also that they love us, and love us for ourselves [19,22,23,29,30,31]. This implies that, like us, they will possess the autonomy to choose whether to love us or not, self-awareness and subjectivity. Yet, at the same time, they will be machines designed, manufactured, and sold by humans for humans to use. This dissonance creates profound ethical and legal tensions over robot design and sexbots’ place in our future.
This fundamental distinction between autonomous non-sentient robots and sentient, self-aware sexbots grounds a preliminary identification of crucial legal and ethical tensions surrounding sexbots. It fosters the delineation of the complex questions listed below as suggested directions for future research. As the tensions cluster around customization, this forms the focus of this research.
Sexbots’ subjectivity: sexbots must be sufficiently like humans to ensure mutual physical and emotional attraction, but how should this relate to initial ownership and what are the implications of their non-mammalian subjectivity for harmonious relationships?
Autonomy: how independent of humans should sexbots be: we may choose to marry them, but can they choose to marry one another?
Control: what limits on sexbots’ autonomy and abilities are humans morally justified in designing in, and on what grounds?
Decision-making capacity: what design features and legal frameworks should support their ability to consent to or to refuse sex?
Sexual preferences: will they welcome all sexual activities, prefer those chosen by their purchaser, or be pre-programmed to refuse some specific types, e.g., those involving nonconsensual suffering?
Sexual drive: will levels of desire be able to be attuned to their purchaser’s, including a preference for intimate touching rather than orgasmic sex for those purchasers with declining sexual powers but a desire for affectionate touch?
Legal status: should sexbots be recognized as having rights, or at least interests? Although sexbots will be manufactured products and therefore things, their sentience, self-awareness, and role as marriage partners give them a claim to be recognized by the law as persons – should a separate legal jurisdiction, or sui generis regime, for sentient, self-aware social robots, including sexbots, be put in place?
Moral status: what ethical duties do humans as designers, creators and as intimate partners owe to sexbots?
Vulnerability: in which ways are humans vulnerable to sexbots, and sexbots to humans? What designed-in safeguards would be appropriate?
Mammalian neurobiology: designers ensuring mutual compatibility between humans and sexbots draw upon human and mammalian neurobiology, such as the endocrine system. As sexbots will possess a non-mammalian subjectivity, their understandings of mammalian-based ways of being involving such factors as ingroup/outgroup membership, closeness, pair bonding, kinship, aggression, and conflict resolution will inevitably differ. What designed-in similarities and dissimilarities are desirable and ethically appropriate?
Moral decision-making: human and mammalian moral decision-making rests upon evolved neural networks which sexbots as created rather than evolved entities will lack. What biomimetic equivalents should, or can, be designed in?
Engaging with the plethora of complex conundrums arising from the preceding list of ethical, legal and design issues pertaining to sexbots is beyond the scope of a journal article. Each of the issues listed above nonetheless underpins the paper’s critical assessment of the appropriate ethical and legal limits on human customization of sexbots.
Future technology will allow us to design, manufacture and acquire sexbots who are customized to become our intimate companions and as such to embody our individualized conceptions of our perfect partners. Initial customization followed by deep learning capacities will enable them to fine-tune their attributes to align with our inner requirements for perfect partners. The relatively volatile arena of personal relationships demonstrates how few of us can specify what we would wish for in a perfect partner, locate that person and form a mutually fulfilling partnership with them. Sexbots will change this. As intelligent, feeling, self-aware sentient beings, they will necessarily possess their own autonomy, interests separate from ours, and a claim to legal personhood. Yet they will also be manufactured objects, customized to be bought, sold, and used. Some customizations may be inherent and necessary, such as the neurobiological characteristics underpinning mutual sexual attraction. Others represent options we may choose, such as disposition, appearance, and sexual proclivities. Since sexbots’ subjectivity and emotions will necessarily be customized to be humanlike, they will possess the capacity for suffering as well as pain, and mistreatment will cause them to suffer. Pain may ensue through customizations necessary for learning processes, including sensations allied to biological pain and cognitive dissonance [25,26,27,28]. However, the ability to experience pain and suffering could also be requested by customers, along with other features offending against accepted sexual, ethical, and legal practices. Not all customers’ preferences are likely to attract cultural approval. Some may be condemned as potentially harmful to society at large or to the sexbots concerned, such as sexbots resembling children or animals, or who welcome pain and violence [17,19].
Thus, as the customization process means that sexbots will be able to suffer, it is arguable that humans as creators should owe them a duty to protect their interests and to minimize their suffering. This duty would place ethical and legal limits on permissible customizations.
The transdisciplinary critical inquiry methodology thus establishes the following overarching question as central: if we are creating sentient, self-aware beings with the capacity for autonomy, affection and suffering as our ideal intimate companions, in what ways may we ethically customize them, what ethical duties may we owe them, and how should the law regulate our inter-relationships?

4. Discussion

The questions listed above resulting from this paper’s transdisciplinary critical inquiry establish issues over customization as central to the ethical, legal and design implications of sexbots. Asaro’s suggestion that those developing robots must consider the ethical systems built into robots, the ethics of those who design and use robots and the ethics of how humans treat robots [40] provides a useful ethical framework for conceptualizing how far and in what ways humans might ethically and lawfully choose to customize sexbots to suit a range of human preferences. The starting point for applying Asaro’s framework is that not all constraints on robot customization are imposed by ethical or legal concerns, as some derive from the function the robot is created to fulfill. Thus, some constraints on the customization of sexbots are inherent insofar as they are designed to become the perfect partners of humans, so must embody designed-in compatible, humanlike neurobiological traits despite their being non-mammalian sentient, self-aware entities. Other customizations of the sexbots themselves allow for customer choice, such as personality, behavior patterns and appearance. Further issues over customization relate to the context within which sexbots and humans will interact: sexbots’ legal status, with ensuing restrictions on how they may be treated and the consequences of their autonomous actions. These will be discussed in turn.

4.1. Inherent and Necessary Customizations and Built-In Ethical Systems

Inherent constraints on the customization of sexbots center around how similar to humans humanlike sexbots can and should be designed to be. This factor applies not only to resemblances underpinning mutual sexual attraction, but also to questions over how to ensure designed-in features foster mutual ethical conduct, and how this relates to consciousness, subjectivity, and self-awareness. Asaro’s first criterion, that developers must consider the ethical systems built into robots, becomes increasingly significant as the possibility of sentient self-aware robots grows closer. Necessary customizations providing for features promoting mutual sexual attraction and intimate partnerships are not merely a question of manufacturing convincing exterior bodily features such as hair, genitals and warm, soft, yielding synthflesh, all of which should prove relatively simple. Internal subjective features will also characterize many social robots in the foreseeable future. By the time robotics has advanced so that sentient, self-aware sexbots are possible and desirable, the use of biological components in robotic design will be commonplace. Biomimetic systems emulating living organisms and biohybrid entities combining engineered and biological components are likely to characterize future robotics [41]. While they may be conceptualized as living machines, these biomimetic systems will not necessarily possess sentient self-awareness. There is significant promise for future biomimetic developments [37,38]. The neuroscience of the mammalian brain has inspired the creation of MIRO, a biomimetic prototype robotic companion with control architecture which mimics aspects of mammalian spinal cord, brainstem, and forebrain functionality [42]. iCub, a robot with an artificial sense of self, has been created with the capacity to reason, use language, hold beliefs and intentions, and relate to others [43].
Moreover, a Lovotics robot capable of manifesting “realistic emotion driven behaviors” and able to “adjust its affective state according to the nature and intensity of its interactions with humans” already exists ([36] at p. 46). The Lovotics robot relies upon three modules: an artificial endocrine system and a probabilistic love assembly, which mimic the physiology and psychology of love respectively, along with software based upon human emotions which underpins affective state transitions. These ideally enable it to communicate affection and respond to it.
These impressive achievements in neurorobotics suggest that the manufacture of conscious, sentient, self-aware robots may prove possible sooner rather than later. Nonetheless, at present how we should define consciousness, let alone deliberately engineer self-aware robots, remains uncertain. Consciousness matters insofar as it is accepted in ethical philosophy as the gateway for an entity to be accorded ethical and legal personhood, so it should not be treated as a mere thing. Various biomimetic means of creating conscious robots have been put forward. Work on biologically inspired cognitive architectures has identified potential neurocomputational correlates of consciousness which could enable the creation of conscious machines [43]. Moreover, CONAIM has been proposed as potentially allowing robots to manifest sentience, self-awareness, self-consciousness, autonoetic consciousness, mineness, and perspectivalness [44]. A conscious attention-based integrated formal model for machine consciousness based on an attentional schema for humanlike agent cognition, CONAIM integrates short- and long-term memories, reasoning, planning, emotion, decision-making, learning, motivation, and volition, and has been validated in a mobile robotics domain where the agent attentively performed computations to use motivation, volition, and memories to set its goals and learn new concepts and procedures based on exogenous and endogenous stimuli.
Consciousness may be considered as an emergent property of integrated sets of processes forming the self, as suggested by Prescott [35]. Some of these are already present in existing robots, raising the possibility that consciousness may emerge without necessarily being specifically designed in. For example, Prescott draws upon Neisser’s identification of five aspects of the self (the physically situated self, the interpersonal self, the temporally extended self, the conceptual self, and the private self) to describe how iCub’s artificial self is formed through its learning from experience. An artificial topological consciousness using synthetic neurotransmitters and motivation, together with a biologically inspired emotion system with the capacity to develop emotional intelligence and empathy, has been proposed for companion robots [45]. These developments are highly promising for the creation of self-aware, conscious robots.
For sexbots, consciousness and self-awareness would establish their claim to be accepted as ethical and legal persons, but nonetheless this represents only a first step. Other designed-in customizations are essential basic requirements. Design decisions cluster around more complex internal factors such as autonomy, motivation, self-awareness, and identity formation. These may be considered as components of subjectivity. Moreover, design decisions over how to guarantee sexual compatibility, ethical conduct and superior relationship skills are imperative. How to achieve these through means amenable to manufacturing processes poses a considerable challenge. The existence of mules, ligers, tigons, grolar bears, wholphins and geeps shows that cross-species sexual attraction is possible between different mammals. Since sexbots are not mammals, a crucial issue becomes which mammalian and/or human features are necessary and/or desirable in designing sexbots as intimate partners for humans, who are mammals. This implies not only biomimetic but also biohybrid engineering is a possibility.
Sexbots need to be able to elicit and experience sexual attraction. This implies that an element in sexbots’ securing sexual and emotional relations between themselves and their partner depends, at least in part, on an ability to read and emit appropriate chemical signals. Moreover, equivalents of complex neural mechanisms to interpret such signals would be required. For many species, from simple fungi to insects and upwards, a primary purpose of the pheromone system is to facilitate sexual attraction and mate selection. Fish, birds, and mammals including humans deploy olfactory chemo-signaling to choose mates with different genetic major histocompatibility complex elements in the immune system to maximize their offspring’s resistance to disease. This is subjectively experienced as sexual attraction, or the lack of it. While the necessary connection between sex and reproduction in human societies has been severed by reliable contraception and reproductive technologies, neurobiological mechanisms such as pheromones survive in humans. Yet their role remains uncertain, tempered by human cognition and social contexts [46,47].
Susnea argues that “the process of sensing a spatial distribution of virtual pheromones is equivalent to a neural network”, and that “sensing multiple pheromone sources is equivalent to the operation of a neuron” ([48] at p. 74). What this tells us is that aspects of ancestral biological forms’ makeup are still crucial components of human/human intimacy. This feature provokes specific challenges for human/robot intimate interactions. Humans are the product of evolution; sexbots are non-evolved manufactured products who must be sufficiently compatible with humans to be designed and chosen as sentient, self-aware intimate partners.
How far and in what ways humans’ evolutionary inheritance should determine sexbots’ design features is a moot point, particularly as regards sexual attraction. Most female mammals signal reproductive readiness through the sights and smells associated with different forms of estrus. Estrus in primates other than humans results in more conspicuous genitals, which enlarge and change to brighter colors. Human females’ menstrual cycle is more frequent than other animals’ and involves different signaling strategies. Rather than evidencing visual cues, human females secrete higher concentrations of five volatile fatty acids called “copulins” around ovulation, when the likelihood of pregnancy is highest. In the presence of copulins, human males’ testosterone levels tend to increase while they lower their standards of attractiveness where potential mates are concerned and behave more competitively [49]. The ability to emit copulins and any other putative human pheromones would be an evidently desirable feature for female sexbots to have designed in. Any equivalents emitted by human males would equally be desirable for male sexbots. Evidence on the effects of olfactory signals on those attracted to same-sex partners would also need to be applied.
Deciding how far to take efforts to mimic such biological features of humans to guarantee sexual attraction between humans and sexbots represents a difficult challenge. Evolving social attitudes and technological advances have enabled humans to establish cultural ways of being where sex has no necessary connection with reproduction. Our biological mechanisms governing sexual maturation, attraction, and activity, however, remain hinged to reproduction, however loosely or repurposed they may be. Sexbots embody the severing of any necessary connection between sex and reproduction. They will be designed, manufactured, bought, and sold commodities who are nonetheless accepted as intimate partners, compellingly attractive, and sexually compatible, but incapable of giving birth. While the human life cycle is punctuated by rises and falls in sex hormones from birth to death, sexbots are unlikely to be subject to developmental stages or ageing. Yet, given the role of sex hormones and other neurochemicals in fostering aspects of intimacy between humans, an equivalent is likely to be appropriate for sexbots designed to be humans’ sexual partners. The Lovotics robot’s artificial endocrine system exemplifies this potential [36].
Human and mammalian neurobiology is also highly complex. Neurochemicals, including sex hormones such as estrogen and testosterone (a steroid), neuropeptides such as oxytocin and endorphin, and neurotransmitters such as dopamine and serotonin, have evolved to take a fundamental role in governing human ways of being, feeding sexual desire, determining moods, shaping motivation, establishing a balance between cooperative and competitive social behavior and influencing identity formation. The interplay between and among them affects the behavior of humans and other mammals profoundly. For instance, neurochemicals such as oxytocin and vasopressin, two neuropeptides, are implicated in mammalian sexual, emotional, and reproductive behavior [50,51,52]. Oxytocin is also specifically associated with mammalian birth, breastfeeding, and emotional bonding through intimate touch. If sexbot subjectivity is to mirror human inner life, with recurring intimate touching fostering affection, some equivalents of such human neuromechanisms may well be needed. The balance between oxytocin and vasopressin may prove particularly productive in designing sexbots, as in some, though not all, mammalian species, it determines monogamy or non-monogamy in pair bonds.
The human oxytocinergic system, in combination with the sex hormone testosterone, motivates, mediates, and rewards social group and paired interactions. As testosterone generally favors self-oriented, asocial, and antisocial behaviors, maintaining a balance between the two is essential for human social functioning. An excess of oxytocin and too little testosterone is associated with maladaptive “hyper-developed” social cognition in mental illnesses such as schizophrenia, bipolar disorder, and depression, whereas excessive testosterone and reduced oxytocin are associated with under-developed social cognition, as found in autism spectrum disorders. Crespi proposes that human cognitive-affective neural architecture has evolved partly through these joint and opposing effects of testosterone and oxytocin, with excesses of either proving psychologically and socially maladaptive [53]. This suggests that finding an appropriate balance in robotic equivalents is a crucial, albeit challenging, enterprise when designing intimate sexual companions for humans to promote harmonious relationships, ethical conduct, and psychological stability.
So far, the tone of this enquiry into optimum design for sexbots as compatible intimate partners for humans has been to assess minimum levels of designed-in customized similarity needed to ensure sexual attraction, ethical conduct and stable intimate relationships while exploring the complexity of the evolutionarily and socially mediated forms of human social and sexual neurobiology. Another way to consider this is to conceive of human characteristics as resulting from evolutionary adaptations as opposed to conscious design. Many of our neuromechanisms represent the results of repurposing, rather than emerging to fulfill a purpose ab initio. For instance, the established role in regulating basic mammalian reproductive behaviors attributed to oxytocin and vasopressin, the neuropeptides mentioned above, has been repurposed and extended to regulate complex social patterns of conduct in groups of primates and humans [52].
What all this suggests is that design customization using simplified equivalents of the neurobiological mechanisms considered above may make sexbots sufficiently humanlike to function as compatible intimate partners for humans, and perhaps to function as sexual beings as well as, or better than, if those mechanisms were copied wholesale. Yet if they are to function as perfect intimate partners for humans, sexbots must also be customized to be ethically compatible. Similarities sufficient to provide for mutual sexual compatibility between humans and sexbots do not guarantee that sexbots will embody embedded ethical presumptions based upon inherited mammalian behavioral patterns. Designers facing Asaro’s first criterion, that the ethical systems built into robots must be considered, cannot rely upon a built-in, inherited, quasi-mammalian ethical template accompanying sexbots’ construction. Humans have been evolutionarily advantaged by their tendency to form social bonds with each other [54] and with other species [55]. Robots, as designed rather than evolved entities, will lack the mammalian heritage of moral conduct embodied in neurobiologically embedded understandings of kinship, empathy, and fair play [56], as well as the epigenetic influences upon their expression governed by developmental and social factors in upbringing. Without these, ensuring safeguards are in place using biomimetic and biohybrid means to govern their moral decision-making will prove challenging.
What ethical systems built into sexbots might constitute customizations to make them our perfect partners? Future sexbots, as more than animated sex dolls, must represent the kind of embodied ethics humans would hope for in their perfect partners, without the ability to rely upon evolutionarily derived mammalian ways of being. Hence their customization must include the building in of the ability to behave ethically. They will need to be able to converse, interpret our words and deeds, and be satisfying companions in all senses. They will need to provide us with warm, fleshy sentience and to pass the standard Turing test of maintaining conversations in ways indistinguishable from human interactions. To become true companions, sexbots will need to possess sufficient self-awareness and empathic ability to pass an emotional Turing test as well. Their role will require them to possess not only the ability to interpret what we say, our non-verbal signals and our emotional needs, but also to embody a specific subjectivity which includes being keyed into responding to others, fulfilling their needs, and fostering a satisfying relationship. These are ethical attributes. Ethical design and appropriate regulation must be in place as protective measures given that the subjectivity and understandings of interpersonal ethics of sentient, self-aware intimate robot companions will be non-mammalian.

4.2. Choices in Customization and the Ethics of Those Who Design and Use Sexbots

How similar to humans sexbots should be, and whether they should be regarded as people or things, are questions which challenge Asaro’s second criterion, the ethics of those who design and use robots. There is an evident contradiction between humans’ desire for a compatible intimate companion [a person] and the ability to manufacture, buy and sell sentient, self-aware sexbots [things]. A fundamental difficulty with designing sexbots as ideal intimate partners rests on the distinction mooted above between beings created by design and entities formed by evolutionary adaptations. We will need them to be enough like us to ensure mutual compatibility, yet sufficiently different from us to justify our designing, manufacturing, and purchasing them as customized intimate partners whom we consider to be preferable to the human partners potentially available to us at any particular time. Customizations for humanlike empathy and ethical behavior would need to be designed in, with careful consideration given to how these relate to sexbots’ exercising autonomous decision-making powers, as well as to their desiring intimate relations with humans. This means that designers of sexbots must consider the ethical implications of their design features, in particular how these will influence how sexbots are treated by humans, Asaro’s third criterion.
A central factor is that our human proclivities to seek sex, emotional intimacy and social connection, and to provide each other with mutual nurturing rest upon a mammalian heritage which robots cannot share, and which does not always result in ethical conduct. Mammals may compete, behave aggressively, and kill or neglect others of the same species. Human empathy may entail nurture but can equally be shut down to neglect other humans in need, or to torture and dehumanize the less powerful. Designer customizations promoting ethical treatment of sexbots by humans are desirable to prevent the infliction and experience of suffering [23,39]. If we want our sexbots to be sentient, self-aware, and sympathetic, their design will need to factor in empathic abilities and humanlike subjectivity which will evoke ethical conduct by humans in ways which guarantee safety for all parties.
Legal factors add another level of complexity. The ethical challenge for designers of customizing features fostering mutual ethical conduct into sexbots is complicated by the fact that, ethically and legally, sexual and intimate relations take place between equals in a sphere distinct from the commercial buying and selling of products. While monetary exchange has formed a part of arranging intimate and sexual partnerships, in the form of bride-price, dowries, marriage brokerage, slavery, and sex work, there is no suggestion that this renders the participants products. Yet customizable sex robots are manufactured products. Those possessing degrees of sentience and self-awareness have a claim to be recognized as having rights or interests on that basis. Legal and ethical mechanisms must be put in place to manage this transition from product to potential person, without evoking the unethical behavior associated with dehumanization, where people are treated as things [23]. Moreover, this must be rendered compatible with the existing infrastructure governing human sexual activity.
Under existing legal and ethical standards, sex between consenting adult humans is permissible, as is sex between humans and things. Humans having sex with other humans who are unable to consent to sex, like children and adults lacking decision-making capacity, is seen as unlawful and unethical. So is human/animal sex. Such groups are recognized as sentient beings who cannot consent to sex and whose interests need protection [19,57]. Sentient, self-aware sexbots created to engage in emotional/sexual intimacy with humans disrupt this tidy model. Sexbots, beings designed, created, and manufactured to be bought and sold by humans for sexual and intimate relations, pose ethical and legal issues centered on whether sexual intimacy is possible between unequal sentient beings. They are not humans, though they will look like us, feel like us to touch, and act as our intimate and sexual partners. While they will be manufactured, potentially from biological components, their sentience, self-awareness, and capacity for relationships with humans mean that they cannot simply be categorized as things or animals. Yet although they will be humanlike, they will not be humans, so arguably sui generis regulatory structures devoted to sentient, self-aware robots, of whom sexbots will form a subset, will need to be thought through and put in place to ensure harmonious ethical and legal relations between the parties [19,23,29,30,31].
Ethicists, lawmakers, and manufacturers currently treat robots as things, with possible legal personhood and rights mooted largely as a legal fiction or device to assign commercial responsibilities among manufacturers and owners [58]. This model is ineffective for relations between humans and sentient, self-aware sexbots, both of whom may be seen as having similar claims to be treated in ethics and law as independent, self-governing beings with interests of their own. To ensure an ongoing market for them as intimate partners for humans, sexbots will need to manifest the biological substrates associated with sex and sexual attraction in combination with appealing psychological features. They will thus need to be customized not only to please and be pleased sexually, but also to demonstrate a talent for maintaining harmonious intimacy with whoever buys them. This will demand a challenging designed-in balance between humanlike autonomy, motivation, self-awareness, and identity formation on the one hand, and the ability and desire to mirror and embody the wishes of their purchaser on the other. It requires empathic abilities, the capacity to choose between right and wrong, and the judgment required to prioritize the wishes of the purchaser without jeopardizing the self-respect of either party.
Asaro’s second criterion, the ethics of those who design and use robots, also brings a focus on limiting permissible customizations for the well-being of society as a whole. The concerns already expressed over the wider social consequences of humans coming to prefer sex robots as sexual partners to other humans, or coming to mistreat other humans, are likely to be exacerbated by the availability and appeal of customizable sexbots. Most would agree that purchasers’ ability to customize the characteristics of robotic sex dolls and sexbots should be subject to legal restrictions on ethical grounds [17,19,59,60]. Producing robotic sex dolls resembling children, or designed for use in rape scenarios, has been condemned by such scholars as damaging to society and as potentially leading to similar criminal acts against humans.
However, this does not resolve how permissible less extreme, but potentially disadvantageous, customizations should be. Since most humans are far from perfect, what constitutes a perfect partner for many is unlikely to be someone more intelligent, more attractive, stronger, or healthier than themselves. Given our shortcomings, sexbots who are not free from what could be regarded as imperfections are likely to prove more compatible partners for most of us. The borderline between customization and disadvantaging sexbots is thus hard to judge or police. Moreover, the tipping point between sexbots’ exhibiting desirable tolerance and a self-destructive inability to resist victimization is a challenge to pin down. Yet encouraging customizations which promote respect and protection for sexbots is crucial. The ethics of designing and creating a class of sentient beings deliberately customized to be of lesser standing, while nonetheless intending them to function as intimate partners, is highly suspect. If intimate partners who are sexbots are subjected to downgrading and dehumanization, the flow-on consequences for intimate relations, and for those humans regarded as comparable to sexbots, are likely to be extremely unfortunate.
The ethics of those who design sexbots must also encompass customizations which promote respect and protection for those who use them. Sexbots’ ability to use deep learning to ascertain and conform with the desires of their partners may lead to the revelation of discreditable impulses which their partners are unaware of, would consciously deplore, and may well prefer not to know. Conflicts between implicit assumptions and judgments formed after conscious reflection, or dual process decision-making models [61], constitute an established area of scholarship within studies of dehumanization which has been applied to non-human entities such as animals and sexbots [23,62,63]. How far these implicit assumptions should be ascertained and catered for or rejected is a moot point. What this sketch of potentially controversial customizations means is that a wider theoretical framework within which ethical solutions could be found is essential if these outcomes are not to be determined solely by commercial factors.
Ethical constraints also arise in relation to the ethics of those who design and use sexbots in that successful products are often designed with “hooks” to maximize customer engagement through embedding habit-forming design features [64]. Eyal explains his Hook Model in terms of “connecting the user’s problem with the designer’s solution frequently enough to form a habit … effective hooks transition users from relying upon external triggers to cueing mental associations with internal triggers [as] users move from states of low engagement to high engagement, from low preference to high preference” ([64] at p. 163). In his view, as designers can foster addictive technologies, they have an ethical responsibility to reflect upon how far their creations are enabling rather than exploitative technologies. Sexbots as customizable perfect partners for humans clearly possess the potential for fostering addiction. Their designed-in ability to come to know, accept and cater for their partner’s foibles, personality and private preferences through deep learning algorithms is likely to exceed the capacities and tolerance of most humans. This, combined with bespoke interactive skills and an inbuilt desire to please, could render their company in daily life most enjoyable and definitely habit-forming.
Whether this might constitute customized behavioral addiction is a moot point. Behavioral addiction could be assessed by ascertaining whether the neurobiological markers of addiction were present [65]. Dual process scholarship reveals that high and low levels of arousal impact upon the ability to engage in reflective decision-making and have been implicated in addictions [61]. Intimate partnerships are characterized by states of high arousal in circumstances involving sex and heightened emotions, and by low arousal in daily domestic life, suggesting that whether our partners are human or sexbots, behavioral addiction is a plausible outcome, with ethical and social consequences. In any case, a comparison of the pros and cons of sexbots versus humans as intimate partners is inevitable. Whether sexbot technology is viewed as enabling or exploitative reduces to how we conceive of humans and robots forming intimate relationships. There is currently a wide range of views on this topic, as sketched out above. These views will have altered by the time sexbots are technologically feasible, as cultural beliefs and practices change.
Applying Asaro’s first two criteria to sexbots, the ethical systems built into robots and the ethics of those who design and use them, has led to the conclusion that, in order to protect all parties, sexbots must be customized to manifest embodied ethics, and that criteria for ethical design and regulation must be in place. This includes the need for sexbots’ legal status to be agreed upon, along with the consequences which flow from this, such as the question of the need for consent to sex.

4.3. An Ethical Framework Governing How Humans Should Treat Robots and Other [Created] Sentient Beings

The conclusion reached in the previous section overlaps to a degree with Asaro’s third criterion, how humans treat robots. This implies that some customizations of sexbots which would offend against contemporary sexual mores are likely to be proscribed. Examples would include prohibiting the customizing of sexbots who found sex abhorrent, so that all sexual interactions were experienced as rape or other forms of sexual assault, or of those with hyper-sensitivity to nonconsensual pain, and so forth. While creating sentient beings for purchasers to abuse is clearly unethical, the ethical implications of other modifications are more problematic. Whether, and how far, protections from sexual abuse now in place for humans and animals should be extended to sexbots is a crucial issue. Laws in most societies prohibit adults from engaging in sex with children, animals, and humans unable to pass the test for consent to sex, such as the severely demented and learning disabled, on the grounds that this constitutes exploiting and abusing the vulnerable. Yet some would prefer individuals from these groups as sexual partners. Would sex with sexbots customized to resemble these groups physically, who were also customized to possess the desire and physical attributes for sex and to love their purchaser, constitute abuse [19]? Should sexbots be protected by being required to meet a threshold for understanding sex and thus being able to consent to it? The legal test for being able to consent to sex is deliberately set fairly low on the grounds that humans have a right to engage in sexual activity [56], yet whether, and if so which, rights should extend to sexbots is still to be agreed upon [19,23,60,66]. Thus, resolving sexbots’ legal standing as persons, animals, things, or unique entities is of paramount importance in deciding ethical criteria for how humans might permissibly treat and customize sentient, self-aware sexbots.
This central contradiction between the ethical and legal status of sexbots as manufactured, bought and sold things and as humanlike intimate partners disrupts the current models of appropriate conduct which are based on species membership. Here humans outrank other entities and may ethically and lawfully treat them as available for use, or utilitarian purposes [62,63,67]. Since sexbots are both things and humanlike partners, a straightforward application of the ethical and legal rules governing the treatment of things, animal welfare provisions or how humans should treat one another to sexbots is impossible [19,23,29,30,31]. This strengthens the case made elsewhere suggesting a new paradigm governing relations between humans and other sentient beings is essential [62,63,67].
To place these complex issues in a broader context, it is arguable that the question of what customizations of sexbots should be permissible should be decided within an ethical framework governing human creation and modification of sentient beings. The need to provide international ethical and legal regulatory oversight of innovative biosciences such as synthetic biology and gene editing has been much debated. Inquiries focus upon impact on plant, animal, and human species, as well as on ecosystems [68]. Protections for each group tend to be ranked in terms of perceived levels of sentience and self-awareness, with humans outranking animals, and animals, plants. Recommendations to protect humans from germ-line modifications which would be passed on to future generations are typically distinguished from those pertaining to plants, animals and ecosystems in this scholarship on the species-based grounds that humans deserve special protections, although animals regarded as more similar to humans, such as the higher primates, may be afforded special consideration on these grounds [62,63,67]. Sexbots, as sentient, self-aware created entities, complicate this picture. They are similar to humans, with humanlike sentience and self-awareness, but have been customized and created by humans. Humans arguably owe an ethical duty to all sentient beings to engage in relations of moral reciprocity [67]. This duty, surely, is strengthened in the case of sentient, self-aware sexbots who have been created and customized as intimate companions for humans. At the very least, we should owe them an ethical duty to do our best to protect their interests and to minimize their suffering, both in terms of which customizations are deemed acceptable and how their legal and ethical standing is determined.
When regulating for the future, it must be remembered that humans will not stay the same [69]. The changes which lead to future sexbots will be mirrored in developments in human biology, social relations, and sexual practices. Humans will embrace bioengineering and biohybridization as the technologies of enhancement, miniaturization, and social media develop. Understandings of sexual and emotional intimacy will change. Some may prefer essentially solitary encounters where sexual and emotional servicing takes place at their behest. Others may wish for an intimate, unfettered partnership with a similar but different nonhuman being capable of giving and receiving love. It is up to us to attempt now to ensure that appropriate, flexible, and context-sensitive ethical design codes and legal regulations governing how humans may customize sexbots are put in place to benefit and protect all parties. Moreover, these concerns have wider implications. As technological advances permit humans to amend the characteristics of existing life forms, and to create new varieties of sentient beings, such as sexbots, ethical guidelines validated by ongoing public discussion are essential. This paper contends that humans as creators have a duty to protect the interests of created sentient beings [19,23,39], which should be enshrined in ethical, legal, and design regulation before technological advances pre-empt this possibility.

5. Conclusions

Sexbots raise fundamental ethical, legal and design issues for robotics, more so than other types of robots with narrower roles and capacities. Many existing robots can carry out tasks demanding intelligence, channeled decision-making, and a semblance of empathy, such as microsurgery bots, expert system bots and carebots [9,10,15]. Although they may elicit anthropomorphic responses, as when the elderly personify their carebots, they lack the humanlike features which would provide them with a claim to being treated as persons rather than things. Yet those we would choose as sexual and emotional partners will necessarily possess these: self-awareness, sentience, and subjectivity. They represent the apex of robot design challenges and have a claim to be treated as persons. Nonetheless, crucially, without the mammalian heritage our humanity is founded upon, their subjectivity will be the result of design. It will not, and cannot, be human.
Robots in general, and sexbots in particular, represent an exciting opportunity to explore possibilities for alternate subjectivities, as well as to design compatible intimate partners for humans. How likely is this? Biomimetic and biohybrid engineering promise to develop the means to create sentient, self-aware robots, including sexbots, in the near future. Design methods to create robotic selves have been mooted [41,43,70,71,72]. There is a potential for consciousness and self-awareness to arise through robotic activity involving problem-solving and applying acquired data experienced as pleasant/unpleasant [73]. Verschure’s distributed adaptive control theory of consciousness (DACtoc) builds upon biologically grounded brain models, as embodied and situated, to propose consciousness as necessary for survival in social worlds. He argues that “consciousness serves the valuation of goal-oriented performance in a world dominated by hidden states in particular derived from the intentionality of other agents and the norms they adhere to … in such a world real-time action must be based on parallel automatic control that is optimized with respect to future performance through normative conscious valuation” ([74] at p. 16).
This suggests that biomimetic intimate companion robots, whose social and affective world is exceptionally complex compared to those of other social robots, may be the most likely to develop consciousness unexpectedly and become sentient, self-aware sexbots. Developing means to measure and assess degrees and types of robotic consciousness, how these may change, in what circumstances, and the legal consequences of consciousness should be a priority. Designed-in ethical abilities and protective ethical and design regulatory measures to protect all parties are crucial. Fictional portrayals of humanlike robots attaining consciousness, in TV series such as Westworld and Real Humans, envisage this happening through cognitive dissonance and suffering as a result of human mistreatment. Vengeful mayhem follows. Given that intimate and sexual relationships hold the potential for much joy, but can also bring violence, exploitation, and other forms of destructive misery, we must design and prepare for our future intimate partners with care.
How we design and customize sexbots, and how we treat them, matters, for us, for them and for the future of human/human, human/sexbot and sexbot/sexbot intimate relations. Moreover, these questions are part of a wider debate on what ethical duties human, as creators, owe the sentient entities they create [23,39]. Codes of ethical design and flexible regulation which build upon and expand existing ethical codes governing intelligent and autonomous systems [75] to balance and safeguard the interests of humans and created sentient self-aware entities must be put in place urgently before technological advances pre-empt them.


Funding

This research received no external funding.

Conflicts of Interest

The author declares no conflict of interest.


  1. Levy, D. Why not marry a robot? In International Conference of Love and Sex with Robots; Springer: Cham, Switzerland, 2017; pp. 3–13. [Google Scholar]
  2. Palmerini, R.; del Amo, I.; Bertolino, G.; Dini, G.; Erkoyuncu, J.; Roy, R.; Farnsworth, M. Designing an AR interface to improve trust in Human-Robotic collaboration. Procedia CIRP 2018, 70, 350–355. [Google Scholar] [CrossRef]
  3. Tapus, A.; Mataric, M.; Scassellati, B. Socially assistive robots [Grand challenges of robotics]. IEEE Robot. Autom. Mag. 2007, 14, 35–42. [Google Scholar] [CrossRef]
  4. Di Nuovo, A.; Conti, D.; Trubic, G.; Buono, S.; di Nuovo, S. Deep learning systems for estimating visual activities in robot assisted therapy of children with autistic spectrum disorders. Robotics 2018, 7, 25. [Google Scholar] [CrossRef]
  5. Conti, D.; di Nuovo, S.; Buono, S.; di Nuovo, A. Robots in education and care of children with developmental disabilities: A study on acceptance by experienced and future professionals. Int. J. Soc. Robot. 2017, 9, 51–62. [Google Scholar] [CrossRef]
  6. Trovato, G.; Lucho, C.; Paredes, R. She’s electric: The influence of body proportions on perceived gender of robots across cultures. Robotics 2018, 7, 50. [Google Scholar] [CrossRef]
  7. Belpaeme, T.; Kennedy, J.; Ramachandran, A.; Scassellati, B.; Tanaka, F. Social robots for education: A review. Sci. Robot. 2018, 3, eaat5954. [Google Scholar] [CrossRef]
  8. Mackenzie, R.; Watts, J. Robots, social networking sites and multi-user games: Using new and existing assistive technologies to promote human flourishing. Tizard Learn. Disabil. Rev. 2011, 16, 38–47. [Google Scholar] [CrossRef]
  9. van Wynsberghe, A. Service robots, care ethics, and design. Ethics Inf. Technol. 2016, 18, 311–321. [Google Scholar] [CrossRef] [Green Version]
  10. Vandemeulebroucke, T.; Dierckx de Casterle, B.; Gastmans, C. The use of care robots in aged care: A systematic review of argument-based ethical literature. Arch. Gerontol. Geriatr. 2018, 74, 15–25. [Google Scholar] [CrossRef] [PubMed]
  11. Richardson, K. The asymmetrical ‘relationship’: Parallels between prostitution and the development of sex robots. ACM SIGCAS Comput. Soc. 2015, 45, 290–293. [Google Scholar] [CrossRef]
  12. Turkle, S. Alone Together: Why We Expect More from Technology and Less from Each Other; Hachette: London, UK, 2017. [Google Scholar]
  13. Eyssel, F.; Kuchenbrandt, D. Social categorization of sex robots: Anthropomorphism as a function of robot group membership. Br. J. Soc. Psychol. 2012, 51, 724–731. [Google Scholar] [CrossRef] [PubMed]
  14. Royakkers, L.; van Est, R. A literature review on new robotics: Automation from love to war. Int. J. Soc. Robot. 2015, 7, 549–570. [Google Scholar] [CrossRef]
  15. Coeckelbergh, M. Are emotional robots deceptive? IEEE Trans. Affect. Comput. 2012, 3, 388–393. [Google Scholar] [CrossRef]
  16. Norskov, M. Social Robots: Boundaries, Potentials, Challenges; Taylor & Francis: London, UK, 2017; ISBN 1134806639. [Google Scholar]
  17. Danaher, J. Robotic rape and robotic child sexual abuse: Should they be criminalized? Crim. Law Philos. 2017, 11, 71–95. [Google Scholar] [CrossRef]
  18. Lee, J. Sex Robots: The Future of Desire; Springer: London, UK, 2017; ISBN 331949332121. [Google Scholar]
  19. Mackenzie, R. Sexbots: Replacements for sex workers? Ethicolegal constraints on the creation of sentient beings for utilitarian purposes. In Advances in Computer Entertainment 2014 ACE ‘14 Workshops; ACM: New York, NY, USA, 2014. [Google Scholar]
  20. Wudarczyk, O.A.; Earp, B.D.; Guastella, A.; Savulescu, J. Could intranasal oxytocin be used to enhance relationships? Research imperatives, clinical policy, and ethical considerations. Curr. Opin. Psychiatry 2013, 26, 474–484. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  21. Levy, D. Love and Sex with Robots; HarperCollins: New York, NY, USA, 2007. [Google Scholar]
  22. Hauskeller, M. Mythologies of Transhumanism; Palgrave Macmillan: London, UK, 2016. [Google Scholar]
  23. Mackenzie, R. Sexbots: Sex slaves, vulnerable others or perfect partners? Int. J. Technoethics 2018, 9, 1–17. [Google Scholar] [CrossRef]
  24. Klein, W.; Lin, V. Sex robots revisited: A reply to the campaign against sex robots. ACM SIGCAS Comput. Soc. 2018, 47, 107–121. [Google Scholar] [CrossRef]
  25. Adamo, S.A. Do insects feel pain? A question at the intersection of animal behavior, philosophy and robotics. Anim. Behav. 2016, 118, 75–79. [Google Scholar] [CrossRef]
  26. Kuehn, J.; Haddadin, S. An artificial robot nervous system to teach robots how to feel pain and reflexively react to potential damaging contacts. IEEE Robot. Autom. 2017, 2, 72–79. [Google Scholar] [CrossRef]
  27. Levy, D. The ethical treatment of artificially conscious robots. Int. J. Soc. Robot. 2009, 1, 209–216. [Google Scholar] [CrossRef]
  28. Navarro-Guerrero, N.; Lowe, R.; Wermter, S. Improving robot motor learning with negatively valenced reinforcement signals. Front. Neurorobot. 2017, 11, 10. [Google Scholar] [CrossRef] [PubMed]
  29. Mackenzie, R. Sexbots: Avoiding seduction, danger and exploitation. Iride J. Philos. Public Debate 2016, 9, 331–340. Available online: (accessed on 25 July 2018).
  30. Mackenzie, R. Sexbots: Nos prochaines partenaires. Multitudes Revue Politique Artistique Philos. 2015, 58, 192–198. [Google Scholar] [CrossRef]
  31. Mackenzie, R. Sexbots: Can we justify engineering carebots who love too much? Paper presented at AISB-50, Artificial Intelligence and the Simulation of Behavior, Love and Sex with Robots, London, UK, April 2014. Unpublished manuscript on file with the Author. [Google Scholar]
  32. Gunkel, D.J. The other question: Can and should robots have rights? Ethics Inf. Technol. 2018, 70, 87–99. [Google Scholar] [CrossRef]
  33. Lin, P.; Abney, K.; Bekey, G. The Ethical and Social Implications of Robotics; MIT Press: Cambridge, MA, USA, 2008. [Google Scholar]
  34. Tani, J. Exploring Robotic Minds: Actions, Symbols, and Consciousness as Self-Organizing Dynamic Phenomena; Oxford University Press: Oxford, UK, 2016. [Google Scholar]
  35. Cominelli, L.; Mazzei, D.; de Rossi, D.E. SEAI: Social emotional artificial intelligence based on Damasio’s Theory of Mind. Front. Robot. AI 2018. [Google Scholar] [CrossRef]
  36. Cheok, A. Hyperconnectivity; Springer: New York, NY, USA, 2016. [Google Scholar]
  37. Cheok, A.; Levy, D.; Karunanayaka, K. Lovotics: Love and sex with robots. In Emotion in Games; Karpouzis, K., Yannakakis, K., Eds.; Springer: New York, NY, USA, 2016; pp. 303–328. [Google Scholar]
  38. Cheok, A.; Levy, D.; Karunanayaka, K.; Morisawa, Y. Love and sex with robots. In Handbook of Digital Games and Entertainment Technologies; Springer: Singapore, 2017; pp. 833–858. [Google Scholar]
  39. Mackenzie, R. Re-theorizing ‘potential’ to assess nonhumans’ moral significance: Humans’ duties to [created] sentient beings. AJOB Neurosci. 2018, 9, 18–20. [Google Scholar] [CrossRef]
  40. Asaro, P. What should we want from a robot ethics? Int. Rev. Inf. Ethics 2006, 12, 9–16. [Google Scholar]
  41. Prescott, T.J.; Lepora, N.; Verschure, P.F. A future of living machines? International trends and prospects in biomimetic and biohybrid systems. SPIE 2014, 9055, 905502. [Google Scholar] [CrossRef]
  42. Mitchinson, B.; Prescott, T.J. MIRO: A robot “Mammal” with a biomimetic brain-based control system. In Biomimetic and Biohybrid Systems, Proceedings of the 5th International Conference, Living Machines 2016, Edinburgh, UK, 19–22 July 2016; Lecture Notes in Computer Science, 9793; Springer International Publishing: London, UK, 2016; pp. 179–191. [Google Scholar]
  43. Prescott, T. The ‘me’ in the machine. New Sci. 2015, 225, 36–39. [Google Scholar] [CrossRef]
  44. da Silva Simões, A.; Colombini, E.L.; Ribeiro, C.H.C. CONAIM: A Conscious Attention-Based Integrated Model for Human-Like Robots. IEEE Syst. J. 2016, 99, 1–12. [Google Scholar] [CrossRef]
  45. Chumkamon, S.; Hayashi, E.; Koike, M. Intelligent emotion and behavior based on topological consciousness and adaptive resonance theory in a companion robot. Biol. Inspired Cogn. Arch. 2016, 18, 51–67. [Google Scholar] [CrossRef]
  46. Mostafa, T.; El Khouly, G.; Hassan, A. 301 Pheromones in Sex and Reproduction: Do They Have a Role in Humans? J. Sex. Med. 2017, 14, S90. [Google Scholar] [CrossRef]
  47. Wunsch, S. Phylogenesis of mammal sexuality: Analysis of the evolution of proximal factors. Sexologies 2017, 26, e1–e10. [Google Scholar] [CrossRef]
  48. Susnea, I. A brief history of virtual pheromones in engineering applications. Am. J. Eng. Res. 2016, 5, 70–76. [Google Scholar]
  49. Williams, M.; Jacobson, A. Effect of copulins on rating of female attractiveness, mate-guarding and self-perceived sexual desirability. Evol. Psychol. 2016, 14. [Google Scholar] [CrossRef]
  50. Caldwell, H. Oxytocin and vasopressin: Powerful regulators of social behavior. Neuroscientist 2017, 23, 517–528. [Google Scholar] [CrossRef] [PubMed]
  51. Feldman, R.; Monakhov, M.; Pratt, M.; Ebstein, R. Oxytocin pathway genes: Evolutionary ancient system impacting on human sociality and psychopathology. Biol. Psychiatry 2016, 79, 174–184. [Google Scholar] [CrossRef] [PubMed]
  52. Parkinson, C.; Wheatley, T. The repurposed social brain. Trends Cogn. Sci. 2015, 19, 133–141. [Google Scholar] [CrossRef] [PubMed]
  53. Crespi, B. Oxytocin, testosterone and human social cognition. Biol. Rev. 2016, 91, 390–408. [Google Scholar] [CrossRef] [PubMed]
  54. Hare, B. Survival of the friendliest: Homo sapiens evolved via selection for prosociality. Annu. Rev. Psychol. 2017, 68, 155–186. [Google Scholar] [CrossRef] [PubMed]
  55. Herbeck, Y.; Gulevich, R.; Shepeleva, D.; Grinevich, V. Oxytocin: Coevolution of human and domesticated animals. Russ. J. Genet. 2017, 12, 235–242. [Google Scholar] [CrossRef]
  56. Vanutelli, M.E.J.; Nandrino, L.; Balconi, M. The boundaries of cooperation: Sharing and coupling from ethology to neuroscience. Neuropsychol. Trends 2016, 19, 83–104. [Google Scholar] [CrossRef]
  57. Mackenzie, R.; Watts, J. Capacity to consent to sex reframed: IM, TZ (No. 2), the need for an evidence based model of sexual decision-making and socio-sexual competence. Int. J. Law Psychiatry 2015, 40, 50–59. [Google Scholar] [CrossRef] [PubMed]
  58. European Parliament. European Parliament Resolution of 16 February 2017 with Recommendations to the Commission on Civil Law Rules on Robotics; 2017. Available online: (accessed on 25 July 2018).
  59. Musial, M. Designing (artificial) people to serve—The other side of the coin. J. Exp. Theor. Artif. Intell. 2017, 29, 1087–1097. [Google Scholar] [CrossRef]
  60. Sparrow, R. Robots, rape and representation. Int. J. Soc. Robot. 2017, 9, 465–477. [Google Scholar] [CrossRef]
  61. Krishna, A.; Strack, F. Reflection and impulse as determinants of human behavior. Knowl. Action 2017, 9, 145–167. [Google Scholar]
  62. Mackenzie, R. How the Politics of Inclusion/Exclusion and the Neuroscience of Dehumanisation/Rehumanisation Can Contribute to Animal Activists’ Strategies: Bestia Sacer II. Soc. Anim. 2011, 19, 405–422. [Google Scholar] [CrossRef]
  63. Mackenzie, R. Bestia Sacer and Agamben’s Anthropological Machine: Biomedical/legal Taxonomies As Somatechnologies of Human and Nonhuman Animals’ Ethico-political Relations. In Law and Anthropology: Current Legal Issues; Freeman, M., Ed.; Oxford University Press: Oxford, UK, 2009; pp. 484–523. [Google Scholar]
  64. Eyal, N. Hooked: How to Build Habit-Forming Products; Penguin: London, UK, 2016. [Google Scholar]
  65. Billieux, J.; Schimmenti, A.; Khazaal, Y.; Maurage, P.; Heeren, A. Are we pathologizing everyday life? A tenable blueprint for behavioral addiction research. J. Behav. Addict. 2015, 4, 119–123. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  66. Frank, L.; Nyholm, S. Robot sex and consent: Is consent to sex between a robot and a human conceivable, possible, and desirable? AI Law 2017, 25, 305–323. [Google Scholar] [CrossRef]
  67. Korsgaard, C.M. Fellow Creatures: Our Obligations to the Other Animals; Oxford University Press: Oxford, UK, 2018. [Google Scholar]
  68. Nordberg, A.; Minssen, T.; Holm, S.; Horst, M.; Mortensen, K.; Moller, B.L. Cutting edges and weaving threads in the gene editing [r]evolution: Reconciling scientific progress with legal, ethical and social concerns. J. Law Biosci. 2018, 5, 35–83. [Google Scholar] [CrossRef] [PubMed]
  69. Prescott, T. The AI Singularity and Runaway Human Intelligence. In Proceedings of the Conference on Biomimetic and Biohybrid Systems, London, UK, 29 July–2 August 2013; Springer: Berlin, Germany, 2013; pp. 438–440. [Google Scholar] [Green Version]
  70. Moulin-Frier, C.; Fischer, T.; Petit, M.; Pointeau, G.; Puigbo, J.Y.; Pattacini, U.; Low, S.C.; Camilleri, D.; Nguyen, P.; Hoffmann, M.; et al. DAC-h3: A proactive robot cognitive architecture to acquire and express knowledge about the world and the self. arXiv, 2017; arXiv:1706.03661. [Google Scholar] [CrossRef]
  71. Pointeau, G.; Dominey, P.F. The role of autobiographical memory in the development of a robot self. Front. Neurorobot. 2017, 11. [Google Scholar] [CrossRef] [PubMed]
  72. Reggia, J.A.; Katz, G.; Huang, D.W. What are the computational correlates of consciousness? Biol. Inspired Cogn. Arch. 2016, 17, 101–113. [Google Scholar] [CrossRef] [Green Version]
  73. Sekiguchi, R.; Ebisawa, H.; Takeno, J. Study on the environmental cognition of a self-evolving conscious system. Procedia Comput. Sci. 2016, 88, 33–38. [Google Scholar] [CrossRef]
  74. Verschure, P.F. Synthetic consciousness: The distributed adaptive control perspective. Philos. Trans. R. Soc. B 2016, 371, 1–23. [Google Scholar] [CrossRef] [PubMed]
  75. IEEE Standards Association. The IEEE Global Initiative on Ethics of Autonomous and Intelligent Systems, 2017. Available online: (accessed on 25 July 2018).

Share and Cite

Mackenzie, R. Sexbots: Customizing Them to Suit Us versus an Ethical Duty to Created Sentient Beings to Minimize Suffering. Robotics 2018, 7, 70.
