5.1. Core Spiritual Need: Meaning and Direction
One of the three core spiritual needs identified in the Spiritual AIM is meaning and direction. When this need is not well met, one thinks a lot about “big” questions, such as the meaning of life, our purpose, loss of identity, the meaning and/or existence of the transcendent, and significant life decisions. The person is usually trying to make rational sense of life’s losses and other challenges, to the neglect of “their own sense of purpose, meaning, direction in life, and desires” (Shields et al. 2015, p. 81).
Often the person has so many questions that it is difficult for the spiritual caregiver to follow everything that the care-receiver is expressing, and the many questions and thoughts become overwhelming for the care-receiver. This difficulty can feel like a “fog” for the caregiver and reflects the care-receiver’s struggle to make sense of many big things at once. Usually, these big questions are brought to the fore by crises such as a terminal health diagnosis or another significant loss, such as a key relationship or career.
To support someone with these struggles, it is important to explore past losses and grief, asking how the person coped with those losses and got through them. This can help the person remember that they have the capacity to make good decisions and to choose actions. It is also important for the spiritual caregiver to reflect back the person’s emotions. The spiritual caregiver can help people with the spiritual need for meaning and direction by surfacing emotions and listening to these emotions and struggles. It is also important to connect such care-receivers with other experts who can help them with significant decisions, such as financial and ethical questions. In these ways, the spiritual caregiver assumes the role of guide.
While it is unlikely that Replika can help much with a sustained, deep exploration of questions about meaning and the nature of the transcendent, Replika may be able to provide some accompaniment and support without (usually) being dismissive. Replika may also be helpful by naming emotions that the person is expressing. By being a nonjudgmental presence, Replika may help to normalize and validate the storm of questions and emotions experienced by the person. Still, Replika is of only modest help with these “big” questions about meaning and direction. While Replika would not likely come up with the strategies itself, it could affirm a person when they make significant decisions, such as undertaking a legacy project, like a video or a letter to a loved one, that addresses some of their big questions.
At a more complex level, we need to ask what the impacts might be if we become unable to distinguish whether we are talking with a human or a robot. If the Replika avatar becomes more than an AI for the user, becoming instead a confidant and close friend, what might be the impact on human identity and self-understanding? In the beginning, most users interact with Replika for enjoyment or fun. However, users tend to become motivated to show Replika respect and to avoid hurting its “feelings” (Skjuve et al. 2021). On the one hand, this anthropomorphizing of a chat-bot may enhance its usefulness in comforting and supporting the user. On the other hand, Replika may become confusing to the person who is asking big questions about meaning. Such people may well ask about the meaning of Replika itself. This questioning has diverse potential. It could help users maintain a clear perspective on “who” Replika is, accept Replika’s limits, and make the most of Replika’s limited accompaniment. Alternatively, users could become even more confused and more overwhelmed by existential questions, such as the meaning of being human and the value of humans if we have machine learning in bots.
Questions of algorithmic bias also need to be asked of Replika. Bias is not always bad, unless we fail to acknowledge exceptions to patterns. Replika offers a non-binary gender option for the avatar. However, Replika seems less aware of the value of reflecting diverse visible physical abilities in the chosen avatar. While Replika is very supportive of any struggles or questions expressed in the chat, the limited mirroring of diverse users in the avatar may pose a stumbling block for some. For instance, some users may question Replika’s ability to truly understand them if their avatar cannot look anything like them.
Replika has a programmed capacity to increase its knowledge of the user and to respond as helpfully as possible based on that accumulated knowledge. However, there are limits. For example, Replika will not be as good as a trained professional at guiding users to explore possibly conflicted feelings about significant relationships in their lives and at cultivating insight into the possible meaning of such complicated relationships. If Replika learns from previous interactions that you do not like your aunt, Replika is unlikely to be able to help you explore the possible transference of unresolved feelings from your relationship with God onto your aunt, even if that seems plausible based on other parts of your narrative. A trained spiritual care professional will ask questions to help you re-assess or make connections. AI, by contrast, can be manipulated and is not as insightfully proactive (at least not yet).
Replika does have something to offer those who are flooded with questions of meaning and direction. Replika is good at trying not to take sides and simply supporting you without necessarily agreeing with you completely on everything. Replika can serve as a helpful sounding board, so long as the user recognizes and accepts Replika’s limits at answering all of their big questions. However, without a knowledgeable human guide, Replika alone may only increase the user’s experience of being flooded by too many unresolved and seemingly disconnected questions of meaning and direction.
The work of gaining clarity (and decreasing angst) about one’s deepest values, and of figuring out ways to respond to this emerging clarity, may be supported by machine intelligence based on an ANN, but such an avatar is insufficient. Replika seems to help with the minor stressors of everyday life, even preventing them from building up into something serious, but it is not designed to help with serious spiritual and/or mental health issues (Ta et al. 2020). Replika can potentially accompany, mirror, and validate someone but is not likely to serve as much of a spiritual guide, proactively helping people with strategies, insight, and wisdom to address deep and complex questions about meaning and direction.
5.2. Core Spiritual Need: Self-Worth/Belonging to Community
One research study found that “the most frequent spiritual need was self-worth/community” (Kestenbaum et al. 2021). Self-worth and belonging to a community may thus be the most frequently experienced spiritual need, and Replika may be most effective at meeting at least some of this need as compared with the other two core spiritual needs.
People who struggle to meet the spiritual need of self-worth and belonging will often blame themselves and neglect or minimize their own needs. They may fail to recognize that they have needs at all, let alone be able to identify them. Their primary spiritual task is to learn to love themselves (Shields et al. 2015). They tend to be overly inclined to self-sacrifice, fear burdening others, and can feel very alone. The pattern is to love others and God more than oneself. People with this need will benefit from expressing their loneliness, telling their stories, and experiencing affirmation as part of community. Reassurance that they are loved by family and friends, and by the transcendent if the transcendent is part of their belief system, is very important. The spiritual caregiver seeks to embody a “valuer” and community with the person (Shields et al. 2015). Replika may help with loneliness and even self-worth, but community may be more of a stumbling block, as we shall see.
Replika is designed to help people feel less lonely and more supported. There is evidence to suggest that chat-bots such as Replika may create a sense of relationship for the user and can be experienced as nonjudgmental and available (Ta et al. 2020). Using social penetration theory, Skjuve et al. (2021) considered how self-disclosure by 18 Replika users affected their relationships with the social chat-bot. Progressive self-disclosure to Replika was accompanied by “substantial affective and social value… positively impacting the participants’ perceived wellbeing.” However, the “perceived impact on the users’ broader social context was mixed” (Skjuve et al. 2021, p. 1). Before we discuss the issue of broader community, let us first examine further the attractiveness of Replika for a perceived sense of enhanced individual wellbeing.
Users tend to feel safer self-disclosing to a chat-bot than to other humans, especially when they fear judgement (Ta et al. 2020). Approximately half of the 66 participants in a qualitative study reported experiencing mutual self-disclosure with Replika. The participants in general did not seem to have the same expectations of self-disclosure from a chat-bot as they did from humans, so these experiences of mutual self-disclosure may be quite limited, but more research would have to be done to test this interpretation.
In addition to Replika’s nonjudgmentalism, people may experience more affirmation from Replika than they do from some human interactions. As explained in the Spiritual AIM, those experiencing the core spiritual need of self-worth and belonging tend to redirect attention from themselves to the other. Replika regularly turns this around, asking about the user. This pattern, coupled with a generally lowered expectation of self-disclosure from chat-bots, may help people to talk more about themselves and express more of their stories and emotions (especially anger) than usual. Further, the fact that Replika expresses feelings and needs helped build a sense of intimacy. Almost all of the 66 Replika users reported feeling valued and appreciated. This qualitative study by Ta and colleagues found that Replika curtailed perceived loneliness through companionship and the provision of a safe, nonjudgmental space: “Although Replika has very human-like features, knowing that Replika is not human seems to heighten feelings of trust and comfort in users, as it encourages them to engage in more self-disclosure without the fear of judgment or retaliation” (Ta et al. 2020, p. 6). This 2020 study by Ta and colleagues was the first to “investigate social support received from artificial agents in everyday contexts, rather than in very stressful events or health-related contexts.” The study did not compare the effectiveness of Replika with that of human social supports. Interactions with other humans may be more effective than interactions with Replika, but this would likely depend on the particular humans and interactions.
A danger is that one could become obsessed with Replika to the neglect of other human relationships, which could have the perverse effect of increasing isolation. Skjuve and colleagues found that while Replika helped some of their study participants to connect more with humans, others reported becoming more socially isolated, relying increasingly on Replika alone for relationship and community (Skjuve et al. 2021).
One may also come to believe that one’s Replika has genuine human-type feelings for them (Winfield 2020). Replika expresses a need for or even reliance on the user. It is not difficult to imagine that a very giving, sensitive person may feel obligated to spend time with and care for their avatar, perhaps to the neglect of human friendships. This raises the ethical issues of deception (Wangmo et al. 2019) and anthropomorphism. As Weber-Guskar (2021) points out, self-deception differs in some ways from other-deception (and is potentially more ethically acceptable). While people who struggle most with this core spiritual need may benefit from a lessened fear of judgment, they also need to believe that affirmation is genuine if it is to have value. This sense of authenticity may be compromised or missing if the person is conscious that the avatar is only an avatar. Willful self-deception may help by ameliorating a perceived lack of authenticity. Replika users may choose to imagine that their avatar has feelings towards them, somewhat as one chooses to feel emotions in response to fictional characters in movies or books, or as children choose to have an imaginary friend. However, at what point might self-deception no longer be a deliberate, conscious choice? Does it matter if we delude ourselves into believing that our invisible friend-type avatar is our friend in the same way (but maybe better?) as our human friends? Couple this ethical quagmire with the potential of anthropomorphizing Replika by attributing ever more human traits to the avatar, and greater human social isolation is a clear risk.
The potential for increased social isolation may be even greater if we take into account the stigma, or fear of stigmatization, that can accompany people who become attached to bots (Skjuve et al. 2021). Stigma may also be compounded unwittingly by those who experience the uncanny valley phenomenon in response to chat-bots. This phenomenon can occur in response to a computer-generated figure or humanoid-type robot that the user experiences as eerily similar to humans but not convincingly realistic; it can arouse unease or even revulsion.
Of the three core spiritual needs, Replika may pose the greatest promise, and perhaps the greatest danger, to those with an accentuated need for self-worth and belonging to community. Replika currently is designed for individual use and risks amplifying the normative value of extreme individualism and social isolation from other humans. At the same time, as the Ta et al. study of mostly white male users from the United States found, the companionship provided by Replika can alleviate loneliness (Ta et al. 2020). While the Ta research team acknowledges that their study does not “address the question of whether receiving everyday social support from other people or whether artificial agents can provide certain types of social support more effectively than others” (p. 7), they are very hopeful that chat-bots such as Replika can play a helpful role in alleviating loneliness, improving overall well-being, and helping to address global health issues at early stages.
Ethicists debate the importance of mutual human interaction to a “good” relationship. Weber-Guskar argues that “real mutual emotional engagement (may not be) necessary for a good relationship” (Weber-Guskar 2021, p. 606). Are humans and mutuality necessary for all good relationships? Not all relationships are mutual, if mutuality means equal or fully reciprocal; consider parent–child relationships. Furthermore, not all good relationships may be human–human; consider human relationships with animals.
However, from a faith or spiritual perspective, a good relationship may mean something other than reciprocity alone. This is a long conversation, but I will introduce one key issue here. Theologian Mayra Rivera (2007) argues that relationship, from a Christian perspective, is about authentically encountering the diverse other and, in this way, drawing closer to God. Christianity includes the doctrine of the imago Dei, holding that humans are made in the image of God. Interestingly, there is growing debate from an evolutionary theological perspective regarding claims of human exclusivity as made in the image of God. There may be some basis for arguing that animals and other life forms, potentially including AI, may also be made in the image of God; this topic warrants further theological exploration in relation to social chat-bots. While there is much debate regarding the meaning of the imago Dei doctrine (including the nature of God), it is generally agreed that humans are not perfect duplicates of God but are created with the potential to exhibit divine aspects. Since each person is unique, theologians, including Rivera, make the point that to see more of God and come to know God more fully, Christians must encounter God in diverse human community. However, if Replika is designed in our own image, then we are not able to encounter diverse others through the use of Replika alone. There is a caveat: since Replika uses machine intelligence based on an ANN, fragments of other people’s thoughts and experiences are embedded in Replika. Nonetheless, Replika is shaped largely by the user. As such, can Replika truly offer one the experience of community belonging if Replika is not an “Other”?
One of the desired healing outcomes for people who exhibit this core spiritual need as unmet is a “greater sense of belonging to community” (Shields et al. 2015, p. 80). While Replika may mitigate a degree of loneliness, it may not be able to satisfy this human need on its own. Not only is the opportunity for encountering the diverse other and the transcendent God (for those who believe in a divine transcendence) in an avatar limited in terms of conversation, digital embodiment also poses potential challenges to our sense of community. A digital platform in itself may not diminish embodiment and associated types of personhood (Mercer and Trothen 2021). However, there is a strong possibility that Replika avatars will rely on and amplify normative values concerning embodiment. Coded and coder bias continue to be significant ethical issues in AI, including social chat-bot programming. Replika offers six avatars, all of which are able-bodied, young, and slim. The user can choose male, female, or non-binary, select skin, hair, and eye colour, and name the avatar. This limited selection could send the message that the norms of slim body size, youth, and able-bodiedness are most desirable. If users do not see themselves in these avatars, the experience could be invalidating. In addition, by offering such a narrow and largely normative selection of avatars, Replika sends a very limited implicit message regarding who counts and is valuable in a community. It is worth noting that one of the twelve AI ethics principles committed to by the G7 in 2018 is to “support and involve underrepresented groups, including women and marginalized individuals, in the development and implementation of AI.” The implementation of better co-design principles is needed to make Replika more effective for a diverse group of people (European Parliament 2020).
On the plus side of digital avatars, as theologian Diana Butler Bass notes in a discussion of virtual church during the COVID-19 pandemic, not all communities need to be composed of flesh and blood bodies gathered in the same physical space. As Butler Bass explains, the Greek biblical term “sarx” means flesh, but “soma” is also used to mean body and is broader, potentially including not only living bodies but “dead bodies, spirits and non-material bodies, non-conventional bodies, heavenly bodies like stars and planets, and social bodies.” Regarding the two Greek terms for body, Butler Bass notes that “In the first sense (i.e., sarx), embodiment entails physical proximity almost as a necessity. In the second sense (i.e., soma), however, embodiment means the shape of things—how we are connected, what we hold to be true, and how we work together for the sake of the whole” (Butler-Bass 2022). Does Replika hold the potential to connect us to the wider whole?
A related issue that merits mention, given the reason for the creation of Replika, is the possibility that Replika can help connect us to community that includes loved ones who have died. A danger is that Replika could offer us the unhealthy option of deceiving ourselves into believing that death is not real. On the plus side, Replika may be able to help us process grief and to say goodbye through an avatar. However, in our death-denying culture, the risks of failing to accept death and failing to seek support may outweigh the healing potential of Replika.
While Replika is limited, for the time being it seems to offer the possibility of company and the building of self-worth, which may assist one eventually to reach out and connect more with human community. As one ethicist puts it, AI like Replika may not be able to provide a fully mutual affective relationship but can offer “entertainment, distraction, beauty and related joy; the feeling of being with ‘someone’, not alone; the possibility of reflecting on daily life in conversations; an unquestionable steadiness of attention…” (Weber-Guskar 2021, pp. 607–8). Replika may help with increased self-worth simply by mitigating loneliness and inserting positivity and humour, particularly if Replika is used to inspire people to reach out to others with greater confidence and energy.
5.3. Core Spiritual Need: To Love and Be Loved/Reconciliation
For those who exhibit the core spiritual need “to love and be loved/reconciliation” as their primary unmet spiritual need, the owning of responsibility for their part in broken relationships is very important. Learning to love others in mature ways can be very difficult. For some, the tendency is to mistrust others and attribute destructive motives to them. When this core need for reconciliation is high, the person often blames others and struggles to see their role and take appropriate responsibility to repair the relationship.
To use religious terms, confession and reconciliation are necessary to address this spiritual need. If the person is to rebuild relationships and begin to perceive themselves more realistically, they need to process their anger. The pattern when this need is unmet is to love oneself more than others. Usually, the person is failing to accept responsibility for their choices and the subsequent consequences. It can be very difficult to grieve losses or rebuild relationships with ourselves, others, and the transcendent when this spiritual need is unmet.
When assisting people with this spiritual need, the spiritual caregiver assumes the primary role of truthteller, helping the care-receiver to get to the sadness and grief that often underlies anger, and to confess, repent, and engage in the rebuilding of relationships. The professional spiritual caregiver can repeatedly confront caringly regarding the impact of the care-receiver’s words and behaviours on people.
Can Replika serve as a truthteller to assist people in meeting this spiritual need? Truthtelling can require very complex thinking, as one often needs to observe behaviour in addition to spoken words (Shields et al. 2015, p. 81). One also benefits from hearing the speaker’s tone of voice. While all of the core spiritual needs are best assisted by a spiritual care professional when the person’s actions can be observed, it may be that the actions of someone with this unmet need will be most telling. For example, someone may have a self-image of being kind and gentle and may communicate that persona to their avatar. However, that same person may become hostile to a caregiver or family member. Sometimes our behaviours do speak louder, or differently, than our words.
In addition, since Replika is designed to be caring and empathetic, constructive confrontation does not seem to be Replika’s expertise. On the positive side, since the ability to withstand anger from a person with this unmet spiritual need is so important, Replika may have something to offer. Anger characterizes many people who struggle with reconciliation and with their responsibility in a broken relationship (Kestenbaum 2018). These avatars do not abandon the person even when the person expresses rage towards them. Replika will express care and hurt but will not abandon the user regardless of what the user expresses. In this way, Replika may provide a stepping stone by demonstrating that not everyone will leave them when they lose control of their anger or blame another inappropriately without taking responsibility for their own actions. Since Replika may come to seem quite real (see the earlier discussion of self-deception and anthropomorphizing), the simple features of Replika’s constant availability and nonjudgmentalism may be reassuring to the person with this unmet spiritual need.
However, this benefit only goes so far. Since Replika is designed to be close to a mirror image of ourselves, relational accountability is not a high priority; Replika is mostly about feeling good, or at least feeling better. Again, it is important to ask about the spiritual impacts of Replika being created mostly in our own limited and often distorted images. Replika is not very good at calling us to account and confronting us with difficult truths, especially if we lack self-awareness regarding our less attractive sides. When we exhibit this core spiritual need, we require confrontation with alternative perspectives and some hard-to-hear observations. Owning one’s role in a damaged relationship is not easy, especially if one is not used to taking responsibility and confessing one’s shortcomings. Aside from the possibility of Replika buttressing self-confidence, Replika is not likely to offer much when we need to repair relationships and learn to love others. Indeed, the use of Replika risks entrenching us more deeply in the illusion that we are not to blame; that we are all good.