Article

The Use of Artificial Intelligence Tools for Religious Purposes: Empirical Research Among Hungarian Religious Communities

1 Institute of Communication and Media Studies, Department of Communication Studies, Pázmány Péter Catholic University, H-1028 Budapest, Hungary
2 Institute of Sociology, Pázmány Péter Catholic University, H-1028 Budapest, Hungary
3 Doctoral School of Education, Eszterházy Károly Catholic University, H-3300 Eger, Hungary
* Author to whom correspondence should be addressed.
Religions 2025, 16(8), 999; https://doi.org/10.3390/rel16080999
Submission received: 25 June 2025 / Revised: 26 July 2025 / Accepted: 28 July 2025 / Published: 31 July 2025
(This article belongs to the Special Issue Religious Communities and Artificial Intelligence)

Abstract

This study empirically investigates the use of artificial intelligence (AI) tools within Hungarian religious communities, with a focus on Catholic respondents, to assess their awareness, application, and acceptance of AI in religious contexts. By religious communities, we do not mean monastic or priestly communities, but rather communities of lay religious people. Conducted between 10 February and 11 March 2025, the questionnaire-based research (N = 133) employs Campbell’s Religious Social Shaping of Technology (RSST) framework to analyze attitudes toward AI across 15 religious functions. Six hypotheses explore gender differences, religiosity types (church-based vs. self-defined), and the acceptability, authenticity, and ethicality of AI applications. Findings reveal high acceptance for administrative tasks (e.g., email list updates: 64.7%) and technical functions (e.g., live translation: 65.4%), but low acceptance for spiritual roles (e.g., spiritual leadership: 12.8%). Self-defined religious individuals are significantly more accepting, perceiving AI as more authentic and ethical compared to those adhering to church teachings. No significant gender differences were found. The study contributes to digital religion studies, highlighting the influence of religiosity on AI adoption, though its non-representative sample limits generalizability.

1. Introduction

The rapid advancement in artificial intelligence (AI) technologies has permeated diverse societal domains, including religion, prompting a need to examine their implications within specific cultural and religious contexts. In Hungary, a country with a rich tapestry of religious traditions, the integration of AI into religious communities presents a unique area of inquiry. This study focuses on the use of AI tools within Hungarian religious communities, particularly among lay members, to understand their awareness, application, and acceptance of these technologies. The significance of this investigation lies in the diverse approaches to AI among Hungarian religious groups, shaped by their theological and institutional frameworks, as well as the broader socio-cultural context of 2025.

Hungarian religious communities exhibit varied stances toward AI adoption. The Hungarian Catholic Church, guided by the Vatican's teachings and its January 2025 document on AI ethics (Antiqua et Nova 2025), adopts a cautious yet structured approach, emphasizing alignment with doctrinal principles. In contrast, Protestant churches in Hungary lack centralized regulations, leading to more heterogeneous engagement with AI technologies. This divergence underscores the need to explore how religious orientations influence AI's integration into religious practices. AI applications in these communities are multifaceted, ranging from clergy using AI for sermon preparation and scriptural searches to lay members leveraging AI for spiritual guidance or facilitating community information exchange. Additionally, AI may automate elements of religious rituals, such as managing recorded music playback during services, reflecting its potential to reshape traditional practices.

Recent workshops in Hungary on the intersection of religion and AI highlight growing interest in this domain, yet significant research gaps persist. Internationally, digital religion studies have explored AI's role in religious contexts, but these investigations often focus on Western or globalized settings, overlooking Central European perspectives. In Hungary, empirical research on AI's application within religious communities remains scarce, particularly regarding lay perspectives and the influence of religiosity types (e.g., church-based vs. self-defined). This gap is critical, as Hungary's unique religious landscape—marked by a blend of traditional Catholicism, diverse Protestant denominations, and emerging self-defined spiritualities—offers a distinct context for studying AI's social shaping within religious settings.
This study addresses these gaps by empirically investigating AI use among Hungarian religious communities, with a focus on Catholic lay respondents, using Campbell's Religious Social Shaping of Technology (RSST) framework. The research, conducted between 10 February and 11 March 2025 with a sample of 133 participants, examines attitudes toward AI across 15 religious functions, exploring six hypotheses related to gender differences, religiosity types, and the acceptability, authenticity, and ethicality of AI applications. The objectives are to (1) assess the extent of AI awareness and adoption in religious practices, (2) evaluate the influence of religiosity types on perceptions of AI's authenticity and ethicality, and (3) identify the acceptability of AI across various religious functions, from administrative tasks to spiritual roles. In short, the study examines the extent to which Hungarian—primarily Catholic—religious communities are aware of the potential applications of artificial intelligence in religious contexts, the areas in which they utilize these technologies, and the degree to which they accept or endorse their use. By addressing these aims, the study contributes to digital religion scholarship, offering insights into how religiosity shapes AI adoption in a Central European context, while acknowledging limitations due to its non-representative sample.

2. Literature Review and Theoretical Framework

The growing presence of artificial intelligence (AI) is not a recent phenomenon. Because of rapid developments in information technology and ongoing global socio-economic transformations, the role of AI has become a recurring topic in both public discourse and academic inquiry. Although the origins of AI date back to the mid-20th century, the technological advances of recent years—particularly the widespread availability of generative AI tools such as ChatGPT, launched in late 2022—have opened new dimensions in the examination of its societal and educational applicability (Horváth 2023; Vaughan et al. 2025; Turós et al. 2025).
The development and diffusion of AI tools have occurred at an exceptionally fast pace, resulting in the near-daily emergence of new applications and innovations. This dynamic has sparked intensified interest in the academic literature, particularly in discussions of the benefits and potential risks associated with AI (Dietz 2020). The positive and negative effects of AI are increasingly evident not only in scientific and professional contexts but also in everyday life. These developments have triggered changes in public attitudes; for instance, a growing fear of job displacement has been observed in relation to the expanding use of AI technologies.
The year 2022 marked a turning point in the advancement of AI applications, as the American research laboratory OpenAI released ChatGPT, a publicly accessible generative language model. ChatGPT, whose name derives from Generative Pre-trained Transformer, is an AI-based chatbot capable of generating contextually appropriate and coherent texts based on user prompts, and of maintaining interactive communication. Generative artificial intelligence relies on algorithms—primarily large language models (LLMs)—that learn from extensive training datasets and can produce human-like content, including text, images, videos, and other data formats (Bukovinszky-Csáki 2025). The precursors to ChatGPT emerged in the 2010s with the development of large language models based on neural networks trained on massive text corpora. These systems became capable of performing a variety of language-related tasks, such as text generation, translation, and question answering. A key factor in the widespread adoption of ChatGPT was its public availability following a simple registration process, which made it accessible to a broad user base. As a form of communicative AI, ChatGPT is designed to engage in natural, context-sensitive dialogue with human users, thereby facilitating more effective human–machine interaction. The number of users increased dramatically shortly after its release, and in response to this growing interest, a paid version offering more advanced features was launched in the spring of 2023 (Dornics 2025). The impact of ChatGPT on the field of education was also swift and significant, as illustrated by empirical data: within just one month of its release, by the end of December 2022, a total of 128,402 visits to the ChatGPT website were recorded from the Wi-Fi networks of eight major British universities. By January 2023, this figure had risen to nearly one million (982,809 visits), reflecting not only the tool's widespread popularity but also its rapid integration into everyday use, well beyond educational stakeholders (Rajki et al. 2024).
The emergence of artificial intelligence tools—particularly the release of ChatGPT version 3.5 in 2022—has significantly shifted public and scholarly attention toward communicative and generative AI. This new generation of AI technologies has ushered in the possibility that mechanization, automation, and digitization can not only replace human physical labor but also enhance human efficiency in intellectual and creative tasks. The academic community has responded cautiously yet attentively to the advances in AI, systematically assessing its current and potential future impact on various social domains in terms of both opportunities and risks. As with previous sociological studies related to technological developments, the discourse surrounding AI has given rise to both technological optimism and pessimism. Due to the profound implications of AI technologies, regulatory bodies—both at national and supranational levels—have responded rapidly, as evidenced by legal frameworks such as the European Union’s AI Act (2024).
The intense academic interest has extended into the domain of religion and faith in relation to AI. A growing number of recent studies, investigations, and critical essays have explored this intersection (Andriansyah 2023, pp. xi–xii). Notably, in January 2025, the Catholic Church—specifically the Dicastery for the Doctrine of the Faith and the Dicastery for Culture and Education—issued a joint memorandum titled “Antiqua et Nova: Note on the Relationship Between Artificial Intelligence and Human Intelligence”. The 117-point document asserts that artificial intelligence is not an artificial form of intelligence, but rather a product of human intelligence. It outlines both the opportunities and challenges posed by AI in domains such as education, the economy, employment, healthcare, human and international relations, and warfare. This document, jointly formulated by two Vatican dicasteries, addresses not only those involved in faith education and its transmission, but also those seeking to align scientific and technological advancement with the service of the human person and the common good.
Preceding this, in 2023, the Vatican launched Magisterium AI to prevent the dissemination of misinformation in the context of seeking knowledge about the Catholic Church. As stated on the Magisterium AI website: “Search engines often surface unreliable opinions, while general-purpose AI chatbots lack the specific knowledge and doctrinal fidelity required for accurate answers. Magisterium AI was created to address this challenge, providing a unique and effective tool for anyone seeking an authentic connection to the Catholic faith. Magisterium AI draws upon more than 23,000 official Magisterial documents (including the Catechism, Papal Encyclicals, Council Decrees, Canon Law) and over 2300 foundational theological works (from Church Fathers and Doctors such as Aquinas and Augustine, the Bible, and key commentaries) to inform its responses.” (https://www.magisterium.com/hu/about/why-magisterium-ai) (accessed on 24 March 2025).
The impact of artificial intelligence on churches, religions, and faith, at the level of institutions, religious communities, and individual faith practice, is very complex; it conceals both opportunities and threats (Reed 2021; Singler 2023; Kozak and Fel 2024). The scientific literature to date reflects on this complexity from several directions. The relationship between religion and AI is presented in scholarly writing along two basic logics. One approach lists and evaluates the opportunities and threats (Ocampo and Gozum 2023; Ty 2023). The other, which could be called a thematic approach, examines the application of AI in a specific area of religious life, religious communities, everyday religious practice, or the church institutional system.

2.1. Literature on Opportunities and Threats

Among the opportunities, researchers highlight that seeking inspiration from artificial intelligence (AI) can assist clergy in sermon preparation (Vaughan et al. 2025). The practical implications of AI tools within religious contexts are noteworthy, as they offer the potential to enhance religious experiences, save time, and provide deeper insights into abstract concepts (Amegbeha 2023, p. 27). Religious organizations might use AI to personalize messages for followers or to answer common questions. However, this raises ethical concerns, such as whether a machine-generated message can truly carry spiritual meaning. New social media platforms and AI technology provide users with the opportunity to explore their religious identities through the representation of their daily lives in stories, chats, filters, reels, pictures, and images (Andok 2022; Verma 2023).
AI can be used to study religious texts, translate ancient writings, and help people understand religious teachings more deeply. For example, AI tools can analyze scripture across languages and traditions to find similarities or patterns. Chatbots can also answer questions about religion or help people learn about different faiths (Tsuria and Tsuria 2024).
Like any technology, AI can be used in harmful ways. For example, it could be used to spread extremist beliefs, fake religious messages, or manipulate people emotionally. Deepfakes and AI-generated sermons could confuse people or promote false teachings (Rahmi et al. 2024; Cartella et al. 2025). This is why ethical guidelines are needed when using AI in religious contexts. When AI is employed in religious contexts, additional ethical challenges arise due to the deep personal, cultural, and spiritual significance of religion. These concerns build on the general ethical issues outlined above but are contextualized by the unique sensitivities of religious beliefs and practices (Ashraf 2022).

The use of AI may undermine respect for religious beliefs and cultural sensitivities. AI systems used in religious contexts, such as chatbots providing spiritual guidance or AI-generated religious texts, may misrepresent or oversimplify sacred doctrines, rituals, or beliefs. For example, an AI trained on incomplete or biased datasets might produce interpretations of religious texts that are inaccurate or offensive to adherents. Such misrepresentations can lead to the commodification of sacred practices. For instance, an AI generating sermons or prayers may lack the nuanced understanding of a human spiritual leader, potentially trivializing deeply held beliefs. The use of AI may also violate personal and religious institutional autonomy, and spiritual manipulation may occur. AI systems assuming roles traditionally held by religious leaders (e.g., delivering sermons or interpreting scriptures) raise questions about legitimacy and accountability. Unlike human leaders, AI lacks personal faith, moral judgment, or accountability to a religious community. This could diminish the authority of religious institutions and create confusion among adherents about the source of spiritual guidance. For example, an AI-generated sermon might lack the moral weight of one delivered by a human leader with lived experience and accountability to their community (Tampubolon and Nadeak 2024). AI systems designed to offer personalized religious advice (e.g., apps for meditation or prayer) may manipulate users' spiritual experiences by tailoring content to maximize engagement rather than fostering genuine spiritual growth. This can exploit vulnerabilities in individuals seeking meaning or solace. Such manipulation may undermine the authenticity of religious experiences, replacing personal reflection with algorithmically driven suggestions. This could lead to a form of spiritual coercion, where individuals' beliefs are shaped by AI rather than free choice.

In addition, AI systems may reflect biases in their training data, which could lead to the marginalization of minority religions or denominations. For example, an AI trained predominantly on texts from one religious tradition may misrepresent or exclude others, reinforcing dominant narratives. This can exacerbate interfaith tensions or alienate members of minority religious groups, violating principles of inclusivity and equity. For instance, an AI used in interfaith dialogue platforms might inadvertently prioritize one religion's perspective, undermining mutual respect. Religious contexts often involve deeply personal disclosures, such as confessions or spiritual counseling. AI systems collecting or processing such data risk breaching confidentiality, especially if data is stored or shared without robust safeguards.
Mannerfelt and Roitto (2025) ask how both clergy and laypeople should approach the use of AI in order to avoid misleading and false content.
AI can both support and challenge religious freedom. On the one hand, it allows people to express their faith more easily through apps, websites, or social media (Ashraf 2022). On the other hand, AI algorithms sometimes filter or censor content, which might limit religious speech in certain countries or on certain platforms (Temperman 2023). There is also the issue of bias in AI, which might lead to unfair treatment of religious groups. It is important that AI systems respect cultural and religious diversity while avoiding discrimination.
AI might be able to provide information or even simulate spiritual conversations, but it cannot truly replace human religious leaders (Kozak and Fel 2024; Sierocki 2024). Religious communication is not only about giving answers; it is also about empathy, moral guidance, and understanding people's emotions. AI lacks real human experience, faith, and emotional connection. It can support religious leaders by helping them manage information or reach more people, but it should not take their place in personal or spiritual matters.
The most frequently mentioned problem among the threats posed by AI to religion and faith is the threat to human dignity. Singler also notes: “…recognize the very real risks of automated decision-making systems, algorithmic bias, misinformation, and ‘hallucinating’ AI chatbots, and the impact of automating more and more of the tasks that make humans employable and give them an income to live on.” (Singler 2023, pp. 10–11).
The dangers mentioned by the American researcher Heidi Campbell go deeper, to the level of fundamental values underlying the application of AI. As Campbell puts it: “It can be argued that the modern technological enterprise is based on several contentious moral values that drive much of the advancement in digital, mobile, and robotic technologies. At its root is the idea that humanity is flawed and needs to be fixed or improved upon. This has shaped the technological industry around values such as efficiency, progress, and mastery. It is one based on individualism, over a communal mindset.” (Campbell 2023, p. 23). The overvaluation of individualism is particularly problematic in religious communities, since religious and theological knowledge is not only based on the subjective experiences of the individual but also fits into the broader horizon of the history and tradition of the community. In other words, in the religious, faith, and theological field, the communal principle is given an extremely strong emphasis (Bagyinszki 2024). Amegbeha also draws attention to the importance of all this: “Furthermore, the act of worship and communal gatherings hold deep significance for religious communities. The physical presence of individuals, their shared experiences, and the collective energy generated in these spaces are irreplaceable” (Amegbeha 2023, p. 27). The integration of AI into religious ceremonies and rituals also raises the question of their authenticity. At the same time, the emergence of technology and chatbots in the provision of religious content brings with it what users have long experienced on social media: personalized messages and content. In our case, this means personalized religious content, and personalization itself is problematic. Personalization could only be considered truthful if we equated our personality, our self, with our digital behavior, our actions, and our footprint when we receive personalized messages in social and religious media. However, this is not true; a person is more than what they do, search for, and choose on digital platforms (Moore et al. 2017; Marshall et al. 2019). In other words, social media does not tailor the content and decisions it offers us to our entire person, but only to our digital behavior.
Returning to Campbell, the researcher highlights standardization among the characteristics of the Fordist model, which is reflected in AI technology, solutions, decision-making strategies, and the responses of chatbots. This standardization is very far from the religious idea of human uniqueness. Every believer has individual problems, questions, charisms, and sins, for which they expect non-standardized answers from the church and the religious community. Reflecting on these problems regarding moral values, Campbell quotes Chowdhury: “Chowdhury argues that we have allowed the technological culture to engage in ‘moral outsourcing’—deflecting and projecting moral decision-making to others outside the technological enterprise itself, creating these moral dilemmas.” (Campbell 2023, p. 24). In this moral outsourcing, AI decision-making also emerges in areas such as dating apps and autonomous drone weapons (Ott 2023, p. 36).
The following four skills are considered essential for the ethical, useful, and correct implementation of AI use for religious purposes: “Literacy in the affective methods of storytelling needs to join with critical thinking, theological nous, and sociological frameworks, so we can understand this age of generative AI.” (Singler 2023, p. 11)

2.2. Examining the Use of AI in Religious Contexts

Many areas of AI use are mentioned in the literature and are under active research: the generation of religious texts, the support of liturgical occasions with AI tools, use for educational and research purposes, and even robotics (Laurinyecz 2019; Rajki 2023; Tahyné Kovács 2024).
Some of the questions concern fundamentals: What is man? What is creation? Who can be a creator, and who or what cannot be? (Rendsburg 2019; Ott 2023; Haselager 2024). These threads lead from the field of communication and media studies to the worlds of theology, philosophy, and the human sciences (Almási 2024). However, these questions also arise in discourses on posthumanism and transhumanism related to media studies (Rendsburg 2019; Campbell 2023, p. 24; Alkhouri 2024). These writings compare how humans and AI are similar and different in terms of feelings, physical experience, or learning (Amegbeha 2023, p. 27; Ott 2023, p. 36).
AI-powered applications, such as chatbots and virtual assistants, provide instant access to sacred texts, commentaries, and theological interpretations. AI systems equipped with natural language processing capabilities serve as virtual spiritual companions, offering guided meditations, prayer prompts, or ethical reflections based on religious principles. AI-driven apps can provide daily devotionals or mindfulness exercises rooted in specific faith traditions, supporting individuals in maintaining consistent spiritual practices. Such tools cater to the growing demand for personalized spiritual guidance in an increasingly digital world. AI-driven platforms can deliver personalized scriptural readings or answer doctrinal queries with tailored responses, enabling believers to deepen their understanding of religious teachings. AI technologies enhance ritualistic practices by offering immersive and interactive experiences. Virtual reality (VR) systems integrated with AI can simulate sacred spaces, allowing individuals to participate in virtual pilgrimages or recreate religious ceremonies. Additionally, AI-driven music composition tools can generate hymns or chants tailored to specific liturgical needs, enriching communal worship. These innovations enable greater flexibility in how rituals are performed, particularly for those unable to attend physical gatherings.
Religious communities can use artificial intelligence tools in many functions. As Trotta et al. write: “Today’s mass communication tools have enabled ceremonies and observances to be streamed online, extending access to religion and traditions to a growing number of people daily. For instance, internet and AI-powered technologies may help to foster online participation for members of religious communities.” (Trotta et al. 2024). Artificial intelligence tools appear as applications that assist religious events during ceremonies (Düzbayır 2025, pp. 36–37). AI also plays a role in the administrative tasks of the community and can increase efficiency in this segment. It can also help share knowledge and experiences related to religion within the community. Simmerlein conducted a survey related to the first AI church service: “The service at St. Paul’s Evangelical Lutheran Church in Fürth, Bavaria (9 June 2023), lasted 37 minutes. The sermon, blessings, prayers, and music were all produced by AI, utilizing accessible applications such as Pipio, AIVA, ChatGPT, and DeepL. The service itself was conducted by avatars projected on a screen, without any human intervention.” (Simmerlein 2024). The results are noteworthy: Simmerlein found generational differences in the reception of the ceremony: “…that age plays a role in the reception of the AI directing the service, the results were in line with what might be expected: comparing the group of participants under 46 and those over 45, the younger people found the experience more attractive (µ = −0.45) than the older participants” (Simmerlein 2024, p. 135). The acceptance of AI in religious practices is hence likely to be related to overall knowledge about the technology (Simmerlein 2024, p. 139). However, the research does not suggest a correlation between subjective religiosity and AI emotional stimulation, which may be due to the relatively high and homogeneous level of subjective religiosity in the sample (Simmerlein 2024, p. 136). In cases where the religious self-rating of participants was not high and homogeneous, subjective religiosity was shown to be significant: Kozak and Fel, examining Catholic Polish students, found significant differences in emotional responses to artificial intelligence across individuals with different levels of religiosity, noting that more religious individuals show higher levels of fear and anger towards AI (Kozak and Fel 2024, p. 330). In our own study, we also found differences that can be attributed to different levels of religiosity.
Other areas of AI, such as robotics, have also begun to see the emergence and application of human-like humanoids in religious contexts. “… recent innovations in religious robotics underscore how bots can function as new religious communication agents … various religious practices can be technically automatable, but human leadership determines where specific practices will indeed be automated and promoted with credibility.” (Cheong 2023, p. 13). Examples of religious robotics are more likely to be found in the Far East, such as the robot monk Mindar at the Kodaiji Temple in Kyoto, or Xian’er, the robot monk voiced as a nine-year-old boy and introduced at the Longquan Temple in Beijing in 2015 (Cheong 2020b). SoftBank Robotics’ humanoid robot Pepper, with a tablet-like display on its chest, has been able to pose as a monk at Buddhist funeral ceremonies since the second half of the 2010s. Among Christian examples, the BlessU-2 humanoid, developed by the Protestant Church in Germany in 2017 for the 500th anniversary of the Reformation, is mentioned in the literature (Cheong 2020b; Nord et al. 2023). Cheong’s research on religious robots identified and explained four communicative affordances: searchability, multimediality, liveliness, and extendibility (Cheong 2020a). At the same time, robotics can reinforce gender stereotypes with the physical, vocal, and naming elements of AI (Ott 2023, p. 36). As AbdelRahman and Mohammed point out, the main task of the Saudi Sara robot is entertainment and dancing, while the “Ibn Sina” and Zaki (smart man) robots provide banking services (AbdelRahman and Mohammed 2023, p. 18). Many robots are also used in Islamic religious contexts: “Saudi Arabia has implemented the use of robots to assist visitors undertaking religious journeys to Mecca and Medina. These robots are specifically designed to offer guidance to pilgrims, providing instructions on ritual performance and offering legal advice for performing Umrah.” (AbdelRahman and Mohammed 2023, p. 18). Habib and La consider four hypothetical roles that can be performed by religious robots in the future: (1) robots as teaching agents, providing knowledge regarding the fundamental teachings of religious faiths; (2) robots as counselling agents, which, like religious figureheads in the Christian Church, can interact with people as well as counsel and help them in the light of canonical laws and religious scriptures; (3) robots as religious assistants, physically assisting religious figureheads in the accomplishment of everyday chores and activities in churches, synagogues, mosques, and temples; and (4) robots as religious companions, household robots designed to provide companionship for individuals as well as information and demonstrations regarding different aspects of religious teachings and knowledge (Habib and La 2021, pp. 228–29). At the same time, it is important to draw attention to the fact that Jackson et al. found evidence in their empirical research that robot preachers are perceived as less credible than human preachers, and that witnessing a robot preacher predicts less religious commitment (donations to a temple) than witnessing a human preacher (Jackson et al. 2023).

2.3. Theoretical Framework

There are several theoretical frameworks available for researching the media use of religious communities, such as Campbell's theory of the Religious Social Shaping of Technology and Barzilai-Nahon and Barzilai's (2005) theory of Cultured Technology. To compare Heidi Campbell's Religious-Social Shaping of Technology (RSST) theory with Barzilai-Nahon and Barzilai's Cultured Technology (CT) framework and argue for the superiority of Campbell's theory in studying religious communities' AI use, we first briefly outline both theories, focusing on their core principles and applicability to religious contexts. We then evaluate their suitability for analyzing AI adoption in religious communities, emphasizing why Campbell's RSST is better suited. Campbell's RSST framework focuses on how religious communities actively shape technology through four dimensions: religious tradition and history, community values and priorities, technological negotiation and innovation, and communal discourses for legitimizing technology. RSST emphasizes the interplay between a community's theological, cultural, and historical contexts and its engagement with technology. It is specifically tailored to religious communities, providing a nuanced lens to analyze how their beliefs and practices influence technology adoption, adaptation, or rejection. The framework is flexible yet structured, allowing for detailed examination of both the processes and outcomes of technology integration.
The Cultured Technology framework, proposed by Karine Barzilai-Nahon and Gad Barzilai, examines the interaction between technology and culture, particularly in the context of power dynamics and social structures. CT posits that technology is not a neutral tool but is shaped by cultural norms, values, and power relations within specific social groups. It focuses on how cultural factors (e.g., identity, social hierarchies, and institutional practices) influence technology adoption and how technology, in turn, reshapes cultural practices. While CT is applicable to various cultural contexts, it is not specifically designed for religious communities and emphasizes broader socio-cultural dynamics, often with a focus on political and institutional power.
RSST is explicitly designed for religious communities, with dimensions tailored to their unique theological and communal characteristics. CT, while applicable to cultural groups, is more general and does not explicitly address the theological or spiritual dimensions central to religious contexts. RSST provides specific dimensions (e.g., religious tradition, communal discourses) that align with the values and practices of religious communities. CT, in contrast, focuses on broader cultural and power dynamics, which may not fully capture the nuances of religious identity or doctrinal influences. RSST offers clear, religion-specific indicators (e.g., theological arguments, sacred text interpretations) for empirical analysis. CT, while flexible, relies on more general cultural indicators, which may require adaptation to fit religious contexts.

To demonstrate the superiority of Campbell's RSST for studying AI use in religious communities, we evaluate its applicability against the specific challenges of AI adoption, such as ethical concerns, theological compatibility, and communal negotiation, and contrast it with CT's limitations. RSST is purpose-built for religious communities, making it uniquely suited to analyze how AI technologies (e.g., AI-driven prayer apps, chatbots for spiritual guidance, or algorithmic content curation) align with or challenge religious beliefs and practices. Its dimensions—such as religious tradition and communal discourses—allow researchers to examine how AI is interpreted through sacred texts or theological lenses (e.g., whether AI aligns with divine will or raises concerns about idolatry). For instance, RSST can effectively analyze how a Christian community negotiates the use of AI in sermon creation by referencing historical attitudes toward automation and theological debates about human agency. In contrast, CT's focus on general cultural dynamics (e.g., power and identity) may overlook religion-specific factors, such as the role of sacred authority or eschatological concerns, which are critical when studying AI in religious settings. RSST's four dimensions provide a clear and systematic framework for analyzing AI adoption, whereas CT lacks such religion-specific dimensions, making it less precise for dissecting the theological and communal nuances of AI adoption. Campbell's RSST has also been widely applied in studies of religious communities' media use (e.g., internet, social media), providing a robust foundation for extending its application to AI; its established use in empirical research ensures reliability and relevance. CT, while valuable in cultural studies, has less documented application in religious technology studies, making it less tested in this domain.

In sum, while both Campbell's RSST and Barzilai-Nahon and Barzilai's CT frameworks offer valuable insights into technology adoption, RSST is superior for studying AI use in religious communities due to its religion-specific dimensions, clear indicators, and tailored focus on theological and communal dynamics. CT's broader cultural lens, while useful for general socio-cultural analysis, lacks the precision and depth needed to address the unique challenges of AI in religious contexts, such as theological compatibility and spiritual implications. RSST's structured yet flexible approach makes it the more effective theoretical framework for this purpose.
Based on all this, we chose Campbell's Religious Social Shaping of Technology (RSST) theory as the theoretical framework for the research; it is particularly suitable for exploring which technological innovations each religious community uses, what they use them for, and what they do not use them for, in connection with their religious precepts (Campbell 2010, 2013). The roots of the theory stem from the ideas of MacKenzie and Wajcman, who developed their Social Shaping of Technology model as a critique of technological determinism. According to them, simple causal technological determinism is not an adequate theory to describe social change: communities will always consider whether to use the available technology or not. Within Social Shaping of Technology (SST) theory, technology is considered a social process, and the theory recognizes the possibility that social groups shape technologies toward their own goals, rather than the nature of the technology determining its use and outcomes (MacKenzie and Wajcman 1999). Campbell developed this further in relation to religious communities. RSST asks questions about how technologies are conceived and used in the context of the religious community's beliefs, moral code, and historical tradition of relating to others (Campbell 2010). RSST emphasizes that religious communities do not outright reject new forms of technology but rather engage in discourse within the community to consider how technology might impact them; that is, religious communities weigh the potential benefits and drawbacks of technology. To scientifically map this process, according to Campbell, we need to understand the relationship of the religious community to previous and new (communication) technology. In addition, the researcher must precisely outline the dominant values of the religious community and their meaning, significance, and relevance for the community. It is also necessary to explore the negotiation process during which religious communities decide whether new technology is compatible with the community's beliefs and lifestyle (Campbell 2010, 2017).
Indicators that can be derived from Campbell's theory include the following: community decision-making processes regarding technology adoption; specific technological modifications or restrictions introduced; theological arguments or narratives supporting technology use; and communal language and symbolism used to legitimize technology. Campbell's theory underscores that religious communities are not passive recipients of technology but active agents who shape its use according to their values and goals. The application of these dimensions and indicators enables researchers to systematically analyze the dynamics of media use in religious communities, considering their historical, cultural, and theological contexts. The research by Mannerfelt and Roitto presents and analyzes this process very vividly: “Our findings highlight that preachers approach LLMs primarily as a tool for brainstorming and inspiration. Contrary to the fears raised in public debates, none of the preachers blindly accept or copy AI-generated content into their sermons. Instead, they act as critical evaluators, using their theological training and pastoral experience to discern what ideas generated by the LLM are relevant to their context.” (Mannerfelt and Roitto 2025).
An important part of our research was to determine for which purposes members of the religious communities examined consider the use of AI acceptable and in which cases they do not. Following Campbell's suggestion, we also asked whether respondents were familiar with the recommendations and regulations of their church or religious community regarding the use of AI.

2.4. The Hungarian Context of the Research: Religious Distribution Data and AI Usage Data

According to a 2017 cluster study by the Hungarian Századvég Research Center, approximately one-sixth of the Hungarian population can be considered religious in the church's sense, while the proportion of non-religious people is approximately 20%; thus, nearly two-thirds of the population is classified as religious in their own way (Gyorgyovich 2023). The results of the 2022 census are difficult to interpret on the issue of church affiliation, as 40% of the population did not answer the question. Of those who answered, 50.2% declared themselves Catholic, 16.4% Reformed, 3.1% Evangelical, and 3.3% belonged to other smaller religious communities, while 27% classified themselves as not belonging to any religious community or denomination (KSH 2022).
The Hungarian data on the use and development of AI are as follows. Hungary is equipped with solid digital infrastructures and continues to progress on deployment. It should, however, focus more on the deployment of AI technologies. Hungary has increased the 2030 targets for Cloud and Data analytics to align them with the EU goals for 2030. The target set for the adoption of AI technologies continues to be significantly below the EU level target (75%), as Hungary aims at a 24% adoption rate by 2030. In 2024, 7.41% of Hungarian enterprises adopted AI (2030 national target is 24%) after a growth of 101.4% in a year, doubling the 2023 value. Although this 101.4% growth rate is much higher than the EU level growth rate (67.2%), AI adoption in Hungarian enterprises is significantly below the EU average of 13.48% (Digital Decade 2025).
According to the Artificial Intelligence Strategy of Hungary 2020–2030, achieving the projected AI adoption rate by 2030 could mean a GDP surplus of HUF 6,400 billion (11.5%) for the national economy. Compared to the 2030 GDP forecasts, the economic growth attributable to AI is 11.5% for our region and 14% globally; in comparison, Hungary sets a target of 15%. According to forecasts, the increase in productivity enabled by AI ranges between 11 and 37% at the global level; Hungary wants to achieve a 26% productivity increase for Hungarian-owned domestic manufacturing enterprises by 2030. According to estimates, 900,000 employees will be affected by AI and automation by the mid-2030s, and 2 million citizens will actively participate in the management and use of their own data using a data wallet. Additionally, by 2030, 2.5 million citizens will have received AI-assisted education (Artificial Intelligence Strategy of Hungary 2020–2030).
The context provided is critical for analyzing the use of artificial intelligence (AI) within Hungarian religious communities, as it situates the inquiry within the broader socio-cultural and technological landscape of Hungary. Its significance can be elucidated in three key dimensions: the religious composition of the Hungarian population, the national AI adoption landscape, and the potential interplay between these factors in shaping AI use within religious communities.

First, the demographic data on religious affiliation, drawn from the 2017 Századvég Research Center study and the 2022 census, provide essential insight into the religious landscape of Hungary. The finding that approximately one-sixth of Hungarians are religiously committed, while nearly two-thirds identify as religious in their own way, highlights a diverse spectrum of religiosity (Gyorgyovich 2023). Additionally, the census data indicate that 50.2% of respondents identify as Catholic, 16.4% as Reformed, 3.1% as Evangelical, and 3.3% as members of smaller religious communities, with 27% claiming no religious affiliation (KSH 2022). The significant proportion (40%) of non-respondents further complicates the religious profile, suggesting potential sensitivities or disengagement with institutional religion. This heterogeneity in religious identity and affiliation is crucial for understanding how AI technologies might be adopted and perceived within different religious communities. For instance, Catholic communities, as the largest denomination, may have greater institutional resources to integrate AI (e.g., in educational or pastoral activities), whereas smaller or less formalized religious groups may exhibit varied levels of engagement due to resource constraints or ideological differences. The prevalence of non-religious or loosely religious individuals also raises questions about the extent to which secular or individualistic values might influence AI adoption in spiritual contexts.

Second, the data on Hungary's AI adoption and strategic goals, as outlined in the Artificial Intelligence Strategy of Hungary 2020–2030 and related metrics, provide a critical technological backdrop. Hungary's AI adoption rate among enterprises, at 7.41% in 2024, reflects significant growth of 101.4% from the previous year, yet it remains below the EU average of 13.48%. The national target of 24% AI adoption by 2030, while ambitious compared to current levels, is notably lower than the EU's 75% target, indicating a cautious approach to AI integration. The strategy's projections of a potential 11.5% GDP surplus (HUF 6,400 billion) and a 26% productivity increase in domestic manufacturing enterprises underscore the economic prioritization of AI. Furthermore, the anticipated impact on 900,000 employees and the goal of engaging 2.5 million citizens in AI-assisted education by 2030 highlight the broad societal implications of AI deployment. This technological context is vital for analyzing AI use in religious communities because it delineates the resources, infrastructure, and policy environment available to these groups. For example, religious institutions may leverage Hungary's growing digital infrastructure to implement AI tools in administrative functions, community outreach, or religious education.
However, the relatively low national adoption rate suggests potential barriers, such as limited technical expertise or funding, which could disproportionately affect smaller religious communities or those with conservative views on technology.

Third, the interplay between Hungary's religious demographics and its AI adoption landscape raises unique considerations for the integration of AI within religious contexts. Religious communities often navigate tensions between tradition and modernity, and the adoption of AI—a technology associated with rapid modernization—may be influenced by theological or ethical perspectives. For instance, Catholic and Reformed communities, with their structured institutional frameworks, may be more likely to adopt AI for practical purposes, such as sermon preparation, data management, or virtual religious services, compared to smaller or non-denominational groups that may lack resources or view AI with skepticism. The significant proportion of non-religious or unaffiliated individuals also suggests a potential demand for AI-driven spiritual or ethical tools outside traditional religious frameworks, such as apps for meditation or moral decision-making. Moreover, the government's emphasis on AI-assisted education and data wallets could intersect with religious communities' efforts to engage younger, tech-savvy members or to manage data for charitable activities. However, the modest national AI adoption targets and the gap between Hungary and the EU average suggest that religious communities may face challenges in accessing cutting-edge AI tools, particularly in rural areas or among less affluent congregations.
In conclusion, the provided context is indispensable for analyzing AI use within Hungarian religious communities, as it illuminates the religious diversity, technological capacity, and policy environment shaping this phenomenon. The interplay of a heterogeneous religious landscape with Hungary’s developing AI ecosystem underscores the need to consider both the opportunities for innovation and the potential barriers rooted in resource disparities, cultural attitudes, or theological considerations. This framework enables a nuanced examination of how AI is adopted, perceived, and integrated within Hungary’s religious communities, contributing to a deeper understanding of the intersection between technology and spirituality in a specific national context.

3. Research Hypotheses and Methodology

The aim of the questionnaire research was to measure the attitude of Hungarian religious people towards artificial intelligence in various areas of everyday and religious life. Through 15 specific application options, we examined whether the respondents would accept and consider the use of AI in each religious function authentically and ethically.
We defined “religious” respondents based on their self-classification: only those who identified themselves as either religious according to church teachings or religious in their own way were included in the final analysis. Respondents who did not identify as religious were excluded.
In our study, we formulated and tested six hypotheses. Each was examined using one or more statistical methods, selected based on the measurement level of the variables involved and the distributional properties of the data. Below, we outline which variables were analyzed, which statistical procedures were used, and why those tests were appropriate.
The six hypotheses were as follows:
H1. 
Gender differences: Men are more accepting of the use of AI tools in religious communities.
H2. 
People who are religious in their own way, based on religious self-classification, are more accepting of the use of AI for religious purposes compared to those who are religious according to church teachings.
H3. 
People who are religious according to church teachings consider the use of AI tools in religious communities to be less credible.
H4. 
People who are religious according to church teachings consider the use of AI tools in religious communities to be less ethical.
H5. 
AI tools are more accepted in the production of religious media content than in relation to spirituality and rituals.
H6. 
AI tools are more accepted in religious education and research than in relation to spirituality and rituals.
To analyze the six hypotheses, we applied three types of statistical tests: Pearson’s Chi-square, Mann–Whitney U, and Spearman’s rank correlation. We chose these tests based on the level of measurement and distributional characteristics of our data. Pearson’s Chi-square test was used to assess associations between categorical variables, such as gender or type of religiosity, and dichotomized acceptance outcomes. Mann–Whitney U tests were employed to compare the distribution of ordinal-level Likert responses (e.g., acceptance, authenticity, and ethicality scores) across two independent groups (e.g., religious self-definitions), as this non-parametric method does not assume normal distribution. Spearman’s rank correlation was used to examine monotonic relationships between ordinal variables, such as age and AI acceptance scores. The use of these non-parametric methods was necessary due to the non-normal distribution and ordinal nature of most of our data. Each hypothesis and the related statistical procedure are detailed in the Results section.
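To make the analysis pipeline concrete, the sketch below shows how the three tests can be run in Python with pandas and SciPy. This is a minimal illustration rather than the study's actual analysis script: the data frame and its column names (gender, religiosity_type, acceptance_score, age) are hypothetical placeholders for the survey variables.

```python
import pandas as pd
from scipy import stats

# Hypothetical survey extract: the column names and values are illustrative
# placeholders, not the actual variables or data used in the study.
df = pd.DataFrame({
    "gender": ["male", "female", "female", "male", "female", "male"],
    "religiosity_type": ["church", "own_way", "church", "church", "own_way", "own_way"],
    "acceptance_score": [3, 4, 2, 1, 4, 3],  # 4-point Likert acceptance rating
    "age": [62, 24, 47, 55, 31, 29],
})

# 1. Pearson's Chi-square: association between two categorical variables
#    (e.g., gender x type of religiosity, or dichotomized acceptance).
contingency = pd.crosstab(df["gender"], df["religiosity_type"])
chi2, p_chi, dof, expected = stats.chi2_contingency(contingency)

# 2. Mann-Whitney U: compare ordinal Likert scores across two independent
#    groups without assuming a normal distribution.
church = df.loc[df["religiosity_type"] == "church", "acceptance_score"]
own_way = df.loc[df["religiosity_type"] == "own_way", "acceptance_score"]
u_stat, p_u = stats.mannwhitneyu(church, own_way, alternative="two-sided")

# 3. Spearman's rank correlation: monotonic relationship between two
#    ordinal variables (e.g., age and AI acceptance).
rho, p_rho = stats.spearmanr(df["age"], df["acceptance_score"])

print(f"Chi-square: chi2 = {chi2:.3f}, p = {p_chi:.3f}")
print(f"Mann-Whitney U: U = {u_stat:.1f}, p = {p_u:.3f}")
print(f"Spearman: rho = {rho:.3f}, p = {p_rho:.3f}")
```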
Each hypothesis was tested using the most appropriate method(s), as follows (Table 1).
During our research, we published a 14-question survey (3 open and 11 closed questions) on Google Forms between 10 February and 11 March 2025, which was completed by 145 people, of whom 133 were religious.
The inclusion criteria were as follows: Hungarian citizenship, minimum age of 18, and self-identification as religious in either of the two defined ways. There was no exclusion based on denominational affiliation or any demographic aspect. Most respondents were laypeople and did not possess formal theological training. This implies that the findings primarily reflect the perceptions of religious, but non-professional believers.
The sampling procedure was carried out on a voluntary basis by means of an invitation on the 777.blog, sharing on Facebook and LinkedIn, and an e-mail sent to the lecturers of Pázmány Péter Catholic University.
Given the convenience sampling method and the relatively low number of respondents (N = 133), the results are not representative of the entire Hungarian religious population. Rather, the findings should be interpreted as indicative of trends and perceptions among a specific subset of religious people who voluntarily participated.
The design of the questionnaire was grounded in prior theoretical and empirical work. The dual typology of religiosity follows Miklós Tomka’s framework (Tomka 1999), while the selection of 15 AI application domains was based on a preliminary literature review.
The survey items measuring acceptance, credibility, and ethicality were developed from recurring themes in the literature reviewed above.
In our research, we also asked respondents about their own religious self-classification. In our terminology, we followed the formulation of the Hungarian sociologist of religion Miklós Tomka, who distinguished two categories: those who are religious according to the teachings of the church, and those who are religious in their own way (Tomka 1999). The latter category can be compared to Grace Davie's category of believing without belonging (Davie 1990, 1997). These categories also appear in Jeff Astley's approach, in the distinction between ordinary religion and institutional religion (Astley 2002, 2016). In other words, both groups define themselves as believers, but one group (institutional; religious according to the teachings of the church) follows the requirements of the institutional church more closely, for example in attending mass, going to confession, and observing fasting, while the other group relies more on personal beliefs and experiences and follows the rules of the institutional church less (ordinary religion; religious in their own way). In this study, we use the Hungarian author's category names.

4. Results

4.1. Demographic Data

Of the respondents relevant to our research (N = 133), 42.1% were male and 57.9% were female. In terms of generational distribution, 21.1% belonged to Generation Z (13–28 years old), 20.3% to Generation Y (29–44 years old), 47.4% to Generation X (45–60 years old), and 11.3% to the Baby Boomer and Silent Generations (over 60 years old). In terms of the highest educational level, 55.6% of the respondents had a university degree (MA/MSc), 27.1% had a college degree (BA/BSc), 14.3% had a high school diploma, and 3.1% had a lower level of education. Regarding their religious affiliation, 64.7% of those who completed the questionnaire were Roman Catholic, 0.8% Greek Catholic, 9.8% Reformed, 3.8% Evangelical, 17.3% other Christian, and 3.8% other.
Based on self-identification, 75.9% of our sample considered themselves religious according to the teachings of the church, while 24.1% considered themselves religious in their own way.
In our study, two demographic variables (gender and self-defined type of religiosity) figured in our hypotheses, so we compared the demographic characteristics associated with them. In the case of gender, there is a notable difference in type of religiosity: in our sample, men are overrepresented in the category of those who are religious according to the teachings of the church (83.9% vs. 70.1%) and underrepresented among those who are religious in their own way (16.1% vs. 29.9%). However, based on the Pearson Chi-square test (p = 0.066), the relationship between gender and type of religiosity does not reach the conventional 0.05 significance level.
Regarding religious identity, Catholics are overrepresented among those who are religious according to the teachings of the church (68.3% vs. 53.1%), while other denominations (31.7% vs. 46.9%) are underrepresented; however, none of the Chi-square tests showed a significant relationship. Those who are religious according to the teachings of their church tend to be older (Generation X and Baby Boomers: 64.4% vs. 40.6%), while those who are religious in their own way tend to belong to the younger generations (35.6% vs. 59.4%). When examining four age generations, the Pearson Chi-square test (χ2(3) = 11.311, p = 0.010) and the Likelihood Ratio test (p = 0.003) show a significant relationship between generation and type of religiosity, suggesting that age strongly influences religious attitudes. However, in one cell the expected frequency is below 5, which somewhat reduces the reliability of the test. When the sample is divided into two generation categories (1. Generations Z and Y; 2. Generation X and older), the Pearson Chi-square test (χ2(1) = 5.643, p = 0.018) shows a significant relationship between generation group and type of religiosity, confirming that age has a significant effect on religious attitudes.
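For illustration, the pooled two-generation test can be reproduced in a few lines of Python; this is a minimal sketch using SciPy, with cell counts back-calculated from the percentages reported above rather than taken from the raw data file.

```python
# Minimal sketch of the pooled 2 x 2 generation-by-religiosity test (SciPy).
# Cell counts are reconstructed from the reported percentages
# (N = 133; 101 church-religious, 32 religious in their own way).
from scipy.stats import chi2_contingency

#        church-religious  religious in own way
table = [[36, 19],   # Generations Z and Y
         [65, 13]]   # Generation X and older

chi2, p, dof, expected = chi2_contingency(table, correction=False)  # Pearson, no Yates correction
print(f"chi2({dof}) = {chi2:.3f}, p = {p:.3f}")  # chi2(1) = 5.643, p = 0.018
```

Note that `correction=False` is needed to obtain the Pearson statistic reported here; SciPy applies the Yates continuity correction to 2 × 2 tables by default.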
We also asked whether respondents were aware of their church's guidelines on the use of AI: 27.8% answered yes and 72.2% answered no. Among Catholics, the proportion of yes answers (30.7%) was slightly higher and the proportion of no answers correspondingly lower (69.3%). The fact that fewer than a third of respondents were aware of their church's guidelines may be explained by the Vatican having issued the document Antiqua et Nova barely a month before the survey (see Table 2).

4.2. Acceptability of the Use of AI Tools in Different Religious Functions

Table 3 presents the acceptability of artificial intelligence (AI) tools in 15 different religious functions, measuring respondents' attitudes on a four-point scale (“Not at all”, “Rather not”, “To a small extent”, “To a large extent”). We also calculated two aggregate indicators from the data: the “Not accepting” (“Not at all” + “Rather not”) and “Accepting” (“To a small extent” + “To a large extent”) ratios, which express the negative and positive perceptions in percentage form. The aim of the analysis is to explore the areas in which the use of AI in a religious context is accepted and those in which it meets resistance. The functions listed cover technical (e.g., updating email lists), educational (e.g., religious education), creative (e.g., creating media content), and spiritual (e.g., spiritual leadership) roles.
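The aggregation itself is straightforward; the following minimal sketch (Python with pandas, using hypothetical column names and responses) shows how the two indicators can be derived from the four-point items.

```python
# Minimal sketch: deriving the "Not accepting" / "Accepting" indicators
# from four-point items coded 1 = "Not at all" ... 4 = "To a large extent".
# The data frame and its values are hypothetical stand-ins.
import pandas as pd

df = pd.DataFrame({"live_translation": [4, 3, 1, 2, 4],
                   "spiritual_leader": [1, 1, 2, 1, 3]})

accepting = (df >= 3).mean() * 100   # "To a small extent" + "To a large extent", in percent
not_accepting = 100 - accepting      # "Not at all" + "Rather not"
print(accepting.round(1))
```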
Based on the level of acceptance, we can identify three main categories:
  • High-acceptance areas (accepting > 60%): live translation (65.4%), probably owing to AI's technical advantage in overcoming language barriers; updating an email list (64.7%), an administrative task where the efficiency of AI is clearly beneficial; and budget planning (61.7%), information search (61.6%), and music playlists (60.9%). In these technical or support roles, AI is considered acceptable by more than 60% of respondents.
  • Medium-acceptance areas (accepting 40–60%): creating a newsletter (57.9%), event planning (56.4%), and theological research (54.1%). In these cases, AI is accepted as a support tool, but only to a limited extent, which may indicate partial rejection of its creative or intellectual role. Religious education (43.6%) and visual content creation (40.6%) sit at the lower end of the middle range, indicating uncertainty about the role of AI.
  • Low-acceptance areas (accepting < 40%): the lowest acceptance is for use as a spiritual guide (12.8%), indicating that respondents expect human presence and emotional connection in this role. For community leadership (19.6%), personalized messages (26.3%), text media content (29.4%), and podcast creation (36.1%), the “Not accepting” rate exceeds 60%, especially for tasks with personal or spiritual content (e.g., sermons: 70.7% not accepting).
It can also be observed that acceptance of AI is higher where it provides practical, administrative, or technological support (e.g., translation, email lists, budgeting): the accepting rate there is above 60%, and the “To a large extent” answers often exceed 27%, indicating a strongly positive attitude. Opinions are divided in the case of religious education (43.6%) and theological research (54.1%), where “To a small extent” answers dominate (33.8% and 35.3%), reflecting cautious acceptance. Creative tasks (e.g., visual content: 40.6%, podcasts: 36.1%) show lower acceptance, probably because human creativity is preferred. Spiritual and leadership functions are strongly rejected (spiritual leadership: 87.2% not accepting; community leadership: 80.5% not accepting), and the proportion of “To a large extent” answers is minimal (2.3% and 3.8%). This suggests that respondents consider the human factor irreplaceable in these roles.
We then counted, for each respondent, in how many of the 15 areas they considered the use of AI acceptable (“to a small extent” or “to a large extent”). We obtained a mean of 6.91, a median of 7, and a mode of 0 (14.3%), which, together with a high standard deviation (4.7), indicates substantial dispersion on the 0-to-15 scale. The high variability suggests that the assessment of AI-based tools in religious areas is subjective and context-dependent, and the differing perceptions may require further research to understand their background.
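A sketch of this per-respondent summary, again with hypothetical data, is given below; the real battery has 15 columns, one per area.

```python
# Minimal sketch: number of areas per respondent in which AI use is accepted,
# plus the descriptive statistics reported above (mean, median, mode, SD).
import pandas as pd

items = pd.DataFrame([[4, 3, 1], [1, 2, 1], [3, 4, 2]],   # hypothetical 1-4 ratings
                     columns=["translation", "email_list", "spiritual_leader"])

accepted_count = (items >= 3).sum(axis=1)   # 0..15 with the full 15-item battery
print(accepted_count.mean(),                # reported: 6.91
      accepted_count.median(),              # reported: 7
      accepted_count.mode().iloc[0],        # reported: 0
      accepted_count.std(ddof=1))           # reported: 4.7
```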
During the research, we examined two specific hypotheses regarding the acceptance of AI tools in religious contexts. Hypothesis H5 states that the use of AI tools in the production of religious media content is more accepted than in activities related to spirituality or rituals. Hypothesis H6 states that the use of AI in religious education and research is likewise more accepted than in activities related to spirituality or rituals. To test these hypotheses, we analysed the data in Table 3, which shows the percentage distribution of acceptance of AI tools across the religious functions. For hypothesis H5, we compared the data on the production of religious media content with the functions related to spirituality and ceremonies. According to the table, for the “creation of text elements of religious media content” (e.g., sermons, mission messages), the acceptance rate is 29.4% and the non-acceptance rate 70.7%; for the “creation of visual, moving image elements of religious content”, acceptance is 40.6% and non-acceptance 59.4%; and for religious podcasts, acceptance is 36.1% and rejection 63.9%. In contrast, “spiritual guidance”—which is closely tied to spirituality—shows only 12.8% acceptance, with an exceptionally high non-acceptance rate of 87.2%. Similarly, “leading religious communities” (19.6% accepting, 80.5% not accepting) and “creating personalized religious messages” (26.3% accepting, 73.6% not accepting) also show low acceptance, which can be linked to rituals or deeper spiritual roles.
Based on the data, hypothesis H5 can be partially supported: acceptance is higher for media content production (especially visual and moving image elements) than for functions directly related to spirituality or rituals. However, the lower acceptance of textual media content (e.g., sermons) suggests that the nature of the content and the perception of religious authenticity may influence the assessment of AI tools.
In testing hypothesis H6, we compared the data measured in the “area of religious education” (43.6% accepting, 56.4% not accepting) and “scientific research of a theological nature” (54.1% accepting, 45.8% not accepting) with the functions related to spirituality and ceremonies. The acceptance of religious education and research significantly exceeds the indicators in the areas of “as a spiritual leader” (12.8% accepting) and “leading religious communities” (19.6% accepting). As a further comparison, “searching for religious information” (61.6% accepting) also shows a higher value, which can also be linked to the educational and research context. The results clearly support hypothesis H6: the use of AI tools in religious education and research is more accepted than in more emotionally or traditionally sensitive areas related to spirituality or ceremonies. This suggests that AI is accepted as a technical, intellectual support tool rather than a tool that affects the subjective dimensions of religious experience or ritual.

4.3. Gender Aspects of Perceptions of AI Use in Religious Communities

We then examined whether gender differences can be observed in attitudes toward the use of AI in religious communities. Our hypothesis (H1) was that men are more accepting of the use of AI tools in religious contexts. The data obtained are summarized in Table 4.
To test hypothesis H1, we used the questions described in the previous section. Acceptance rates were calculated separately for men and women, and statistical tests (Chi-square test, Mann–Whitney U test, Spearman's rank correlation) were performed to examine gender differences. We counted in how many of the 15 areas each respondent considered the use of AI acceptable. Men accepted the use of AI in an average of 7.0714 areas (SD = 4.66654), while women accepted it in 6.7922 areas (SD = 4.74713), a small difference (0.2792 points) in favor of men. The p-value of the Mann–Whitney U test (p = 0.773) far exceeds the 0.05 significance level; hence, there is no statistically significant difference between the overall acceptance levels of men and women.
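The comparison can be sketched as follows (Python/SciPy); the two arrays are hypothetical stand-ins for the observed 0–15 acceptance counts of the roughly 56 male and 77 female respondents.

```python
# Minimal sketch of the gender comparison: Mann-Whitney U test on the
# per-respondent acceptance counts. The values below are hypothetical.
from scipy.stats import mannwhitneyu

men = [7, 12, 0, 5, 15, 3, 9]
women = [6, 11, 0, 4, 14, 2, 8]

u, p = mannwhitneyu(men, women, alternative="two-sided")
print(f"U = {u}, p = {p:.3f}")  # reported for the full sample: p = 0.773
```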
Based on Table 4, men showed a higher acceptance rate in eight areas, with differences exceeding 10% in two cases: planning events for religious communities (64.2% vs. 50.7%) and creating and distributing personalized religious messages (33.9% vs. 20.8%). Women were more accepting in five areas, with a difference exceeding 10% in one: creating religious podcasts (41.6% vs. 28.5%). For tasks directly related to religious communities (updating email lists, newsletters, budgeting, event planning, leadership), men showed higher acceptance in all five areas. However, the statistical analyses (Chi-square, Mann–Whitney U, Spearman) did not indicate significant gender differences in any individual area.
The overall results, therefore, suggest that men show slightly higher acceptance (7.0714 vs. 6.7922) in the use of AI in the religious field and are more often accepting in certain religious community applications (e.g., event planning, content creation), but these differences are not consistent and are not statistically significant. The lack of gender differences is consistent with previous research on technology acceptance, which often considers contextual factors (e.g., perceived usefulness, cultural norms) to be more decisive than demographic variables, such as gender (Venkatesh et al. 2003). In religious communities, attitudes towards AI are likely to be more strongly influenced by theological beliefs or perceptions of the use of technology in sacred and administrative roles. For example, both men and women showed high rejection rates for the use of AI as a “spiritual leader” (85.7% and 88.3%) and “in the field of religious community leadership” (78.6% and 81.9%), suggesting that both genders oppose AI in roles considered human or sacred. In contrast, there is greater acceptance for administrative tasks, such as updating email lists (67.8% men, 62.4% women), which is consistent with studies on the use of technology in organizational contexts (Davis 1989).

4.4. Perceptions of AI Use in Religious Communities by Type of Religiosity

The research also examined whether differences in the acceptance of AI use in religious communities can be observed by type of religiosity: religious according to the teachings of their church (N = 101) versus religious in their own way (N = 32). According to hypothesis H2, those who are religious in their own way are more accepting of the use of AI for religious purposes than those who are religious according to the teachings of their church. Acceptance was measured in the previous 15 areas on a four-point Likert scale (“Not at all” = 1, “Rather not” = 2, “To a small extent” = 3, “To a large extent” = 4), with aggregate indicators (“Not accepting” = 1 + 2; “Accepting” = 3 + 4). The results are summarized in Table 5, which includes the p-values of the unpooled and pooled (2 × 2) Chi-square tests, as well as the results of the Mann–Whitney U tests and Spearman rank correlations.
To test hypothesis H2, we calculated acceptance rates in each of the 15 areas and performed statistical tests (Chi-square test, Mann–Whitney U test, Spearman's rank correlation) to determine differences between the two groups. In summary, those who are religious according to their church's teachings found the use of AI acceptable in an average of 6.2871 areas (SD = 4.47289), while those who are religious in their own way found it acceptable in 8.8750 areas (SD = 4.91705). The Mann–Whitney U test indicated a significant difference (p = 0.005), supporting the claim that those who are religious in their own way are more accepting. The standard deviations (4.47289 vs. 4.91705) show similar variability in the two groups.
Based on Table 5, those who are religious in their own way are more accepting of AI in 14 of the 15 areas; the exception is searching for religious information (59.4% vs. 62.4%). In 12 areas the difference exceeds 10%, and in six areas it exceeds 20%: religious podcasts (59.4% vs. 28.7%), as a spiritual leader (34.3% vs. 6%), religious education (62.6% vs. 37.4%), creating visual and moving image elements of religious content (59.4% vs. 34.7%), theological scientific research (71.9% vs. 48.5%), and planning events for religious communities (71.9% vs. 51.5%). The unpooled Chi-square test was problematic in nine areas because of low expected frequencies (1–4 cells < 5). After pooling (Not accepting vs. Accepting), the Chi-square test showed significant differences in seven areas: (1) as a spiritual leader (p = 0.000), (2) religious education (p = 0.013), (3) theological scientific research (p = 0.021), (4) planning events for religious communities (p = 0.043), (5) religious community leadership (p = 0.015), (6) creating visual and moving image elements of religious content (p = 0.013), and (7) producing religious podcasts (p = 0.002). The Mann–Whitney U test showed significant differences in 11 areas (p < 0.05): (1) live translation (p = 0.034), (2) as a spiritual leader (p = 0.000), (3) religious education (p = 0.007), (4) theological research (p = 0.009), (5) budget planning (p = 0.048), (6) event planning (p = 0.021), (7) community leadership (p = 0.009), (8) creating text elements of religious media (p = 0.046), (9) creating visual and moving image elements of religious content (p = 0.005), (10) creating podcasts (p = 0.003), and (11) creating personalized religious messages (p = 0.005). Spearman's rank correlation indicated a positive correlation in 10 areas, with significant relationships (p < 0.05): (1) live translation (p = 0.032), (2) spiritual leader (p = 0.000), (3) religious education (p = 0.006), (4) theological research (p = 0.008), (5) event planning (p = 0.021), (6) community leadership (p = 0.009), (7) text elements (p = 0.027), (8) visual content (p = 0.003), (9) podcasts (p = 0.001), and (10) personalized messages (p = 0.004). In four areas, none of the tests showed a significant relationship: (1) music playlist playing, (2) religious community email list, (3) religious community newsletter, and (4) searching for religious information. The results confirm hypothesis H2: those who are religious in their own way are significantly more accepting of the religious use of AI than those who are religious according to their church's teachings. This is consistent with the subjective, individual nature of their religiosity, which allows for more flexible technological adaptation, as opposed to the more conservative approach of institutional religiosity (Kozak and Fel 2024).
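The per-area test battery described above can be sketched as follows (Python, SciPy/pandas); `item` and `group` are hypothetical stand-ins for one area's four-point ratings and the religiosity grouping.

```python
# Minimal sketch of the per-area battery: pooled 2 x 2 Chi-square,
# Mann-Whitney U, and Spearman rank correlation. All data are hypothetical.
import pandas as pd
from scipy.stats import chi2_contingency, mannwhitneyu, spearmanr

item = pd.Series([4, 1, 3, 2, 4, 2, 1, 3])    # one area's 1-4 ratings
group = pd.Series([1, 0, 1, 1, 0, 0, 0, 1])   # 0 = church-based, 1 = own way

# Pooled 2 x 2 table: Not accepting (1-2) vs. Accepting (3-4) by group
pooled = pd.crosstab(item >= 3, group)
chi2, p_chi, dof, _ = chi2_contingency(pooled, correction=False)

# Mann-Whitney U on the raw four-point ratings
u, p_mw = mannwhitneyu(item[group == 1], item[group == 0], alternative="two-sided")

# Spearman rank correlation between group membership and rating
rho, p_sp = spearmanr(group, item)
print(p_chi, p_mw, p_sp)
```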
The largest differences are observed in spiritual and creative areas, such as the spiritual leader (6% vs. 34.3%, p = 0.000) and religious podcast producer (28.7% vs. 59.4%, p = 0.002) functions. This suggests that those who are religious according to their church's teachings are more likely to reject AI in roles that require human presence, probably for theological or traditional reasons, while those who are religious in their own way are more open to such innovations. In administrative tasks, by contrast, we find no significant differences. The standard deviations (4.47289 vs. 4.91705) indicate similar variability, suggesting that attitudes are diverse within both groups. The more conservative attitude of those who are religious according to their church's teachings may reflect a stronger attachment to religious norms, while the openness of those who are religious in their own way may stem from individual religious practice and greater technological tolerance. In summary, hypothesis H2 was confirmed: those who are religious in their own way are significantly more accepting of the religious use of AI, as supported by the pooled Chi-square, Mann–Whitney U, and Spearman tests.

4.5. Perceptions of the Authenticity of the Use of AI Tools in Different Religious Functions

We then examined how authentic respondents considered the use of AI tools in different religious functions. Table 6 presents the perception of authenticity in the 15 religious areas, broken down in percentages into the categories “not at all” and “rather not” (not authentic) versus “to a small extent” and “to a large extent” (authentic). In addition, Table 6 shows the difference between the responses to the “credible” and “acceptable” questions.
Based on the data in Table 6, the values are similar to those in Table 3 (acceptance), but in 13 cases respondents considered the use of AI in the religious sphere less credible than acceptable. In one case the two values are identical (playing a music playlist), and in one case the relationship is slightly reversed (creating text elements of religious media content). The extent of the difference is not high (at most 6.1%), and a difference of 5% or more can be observed in only three cases: (1) planning the budget of religious communities (55.6% vs. 61.7%), (2) planning events of religious communities (51.1% vs. 56.4%), and (3) searching for religious information (56.4% vs. 61.6%). It follows that the aggregated data also yield a lower value: respondents consider the use of AI authentic in an average of 6.4060 areas (vs. 6.91 for acceptability). The proportion choosing 0, the most common value, is also higher (20.3% vs. 14.3%), meaning that the share of those who do not consider the use of AI authentic in any of the listed areas increased considerably. The higher standard deviation (4.9 vs. 4.7) indicates that opinions are more divided on authenticity than on the usability of AI.

4.6. The Authenticity of the Use of AI Tools in the Life of Religious Communities as a Function of the Attitude Towards Religion

Next, we examined whether those who are religious according to the teachings of their church and those who are religious in their own way differ in how credible they consider the use of AI in religious communities. According to our hypothesis, those who are religious according to the teachings of the church consider the use of AI tools in the life of religious communities less authentic than those who are religious in their own way. Credibility was measured in the previous 15 areas on a four-point Likert scale (“Not at all” = 1, “Rather not” = 2, “To a small extent” = 3, “To a large extent” = 4), with aggregated indicators (“Not credible” = 1 + 2; “Credible” = 3 + 4). The results are summarized in Table 7, which includes the p-values of the unpooled and pooled (2 × 2) Chi-square tests, as well as the results of the Mann–Whitney U tests and Spearman's rank correlations.
Hypothesis H3, that those who are religious according to the teachings of their church consider the use of AI tools in the life of religious communities less credible than those who are religious in their own way, was tested using the above-mentioned questions. The proportion of those who consider AI credible was examined in each of the 15 areas, and statistical tests (Chi-square test, Mann–Whitney U test, Spearman's rank correlation) were performed to determine the difference between the two groups. In summary, those who are religious according to the teachings of their church considered the use of AI credible in an average of 5.7525 areas (SD = 4.66992), while those who are religious in their own way considered it credible in 8.4688 areas (SD = 5.27309). The Mann–Whitney U test indicated a significant difference (p = 0.007), which statistically supports the claim that those who are religious in their own way consider the use of AI credible in more areas. The standard deviations (4.66992 vs. 5.27309) show that the opinions of those who practice their religion in their own way are more divided. Compared to the answers on acceptability, both groups considered the use of AI credible in fewer areas: among those religious in their own way the decline was modest (8.4688 vs. 8.8750), while among those religious according to the teachings of their church it was somewhat larger (5.7525 vs. 6.2871). In both groups, opinions on credibility varied to a greater extent than on acceptability, and the dispersion was higher among those religious in their own way (5.27309 vs. 4.91705 and 4.66992 vs. 4.47289).
Based on Table 7, those who are religious according to the teachings of their church consider the use of AI tools less credible in all 15 areas than those who are religious in their own way. In 14 of these cases the difference between the two groups exceeds 10%, and in six areas it exceeds 20%: (1) religious education (32.7% vs. 59.4%), (2) planning the budget of religious communities (49.5% vs. 75%), (3) creating a religious podcast (25.7% vs. 50%), (4) planning events for religious communities (45.5% vs. 68.8%), (5) music playlists (55.4% vs. 78.2%), and (6) creating visual and moving image elements of religious content (33.7% vs. 56.2%). The unpooled Chi-square test was again problematic in nine areas because of low expected frequencies (1–4 cells < 5). After pooling (Not authentic vs. Authentic), expected counts below 5 occurred in only two cases. The Chi-square tests showed a significant relationship in eight areas: (1) music playlist (p = 0.022), (2) as a spiritual leader (p = 0.008), (3) religious education (p = 0.007), (4) budgeting (p = 0.011), (5) visual elements of religious content (p = 0.023), (6) podcast (p = 0.01), (7) event planning (p = 0.022), and (8) religious community leadership (p = 0.017); in two of these, the expected count in one cell remained below five. In those two cases the Fisher exact test also showed a significant relationship, so we consider them statistically significant differences: (1) as a spiritual leader (Fisher p = 0.015); (2) religious community leadership (Fisher p = 0.022). The Mann–Whitney U test again showed significant differences in 11 areas (p < 0.05): (1) music playlist (p = 0.023), (2) as a spiritual leader (p = 0.001), (3) religious education (p = 0.008), (4) theological research (p = 0.024), (5) budgeting of religious communities (p = 0.008), (6) planning of religious community events (p = 0.008), (7) leadership of religious communities (p = 0.002), (8) textual elements of religious media content (p = 0.011), (9) visual elements of religious content (p = 0.002), (10) religious podcast (p = 0.003), and (11) creating and distributing personalized religious messages (p = 0.015). Spearman's rank correlation showed a significant relationship in 12 areas; the three exceptions were (1) live translation, (2) religious community newsletter, and (3) searching for religious information. Taking the 15 areas one by one, in these same three areas none of the indicators showed a significant relationship between the two groups. The results confirm hypothesis H3: people who are religious according to the church's teachings consider the use of AI tools in the life of religious communities less credible than people who are religious in their own way.
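The fallback used when a pooled cell has a low expected count can be sketched as follows (Python/SciPy); the 2 × 2 counts are hypothetical, shaped like the spiritual-leader credibility cross-table (column totals 101 and 32).

```python
# Minimal sketch: when an expected count in the pooled 2 x 2 table falls
# below 5, report Fisher's exact test alongside the Chi-square.
from scipy.stats import chi2_contingency, fisher_exact

#          church-religious  religious in own way
table = [[95, 22],   # Not credible
         [6, 10]]    # Credible

chi2, p_chi, dof, expected = chi2_contingency(table, correction=False)
odds, p_fisher = fisher_exact(table)
print(expected.min())        # minimum expected count; < 5 triggers the Fisher check
print(p_chi, p_fisher)
```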

4.7. Perceptions of the Ethicality of the Use of AI Tools in Various Religious Functions

The next question examined the extent to which respondents consider the use of AI tools in various religious functions ethical. Table 8 presents the perception of ethicality in the 15 religious areas, broken down in percentages into the categories “not at all” and “rather not” (not ethical) versus “to a small extent” and “to a large extent” (ethical). In addition, Table 8 shows the difference between the answers to the “do you consider it ethical” and “acceptable” questions.
Ethicality is also an important factor in the religious use of AI. The data in Table 8 show that the proportion of those who consider AI use ethical does not differ substantially from the proportion who consider it acceptable (at most 7.6%). In three areas, the proportion considering it ethical is higher (music playlist 6.8%, live translation 1.5%, spiritual leader 0.8%); in one case it is the same (making a religious podcast); and in the remaining cases it is lower (theological research 1.5%, personalized religious messages 1.5%, leading religious communities 1.5%, creating text elements of religious media content 2.3%, visual elements of religious content 2.3%, religious education 3.8%, searching for religious information 5.2%, religious community newsletter 5.3%, planning religious community events 5.3%, updating email lists 6.1%, budget planning 7.6%). The mean number of areas in which AI can be used ethically is 6.5789, lower than the mean number of areas considered acceptable (6.91) but higher than the mean number considered authentic (6.4060). The most common value for ethicality is also zero, with a percentage (17.3%) between those for acceptability (14.3%) and authenticity (20.3%). The standard deviation is essentially the same as for authenticity (4.9); hence, respondents' opinions were similarly divided.

4.8. The Ethics of Using AI Tools in the Life of Religious Communities Depending on the Attitude Towards Religion

Finally, we examined whether those who are religious according to the teachings of the church and those who are religious in their own way differ in how ethical they consider the religious use of AI. According to our hypothesis, those who are religious according to the teachings of the church consider the use of AI tools in the life of religious communities less ethical than those who are religious in their own way. Ethicality was measured in the previous 15 areas on a four-point Likert scale (“Not at all” = 1, “Rather not” = 2, “To a small extent” = 3, “To a large extent” = 4), with aggregated indicators (“Unethical” = 1 + 2; “Ethical” = 3 + 4). The results are summarized in Table 9, which also includes the p-values of the unpooled and pooled (2 × 2) Chi-square tests, as well as the results of the Mann–Whitney U tests and Spearman's rank correlations.
Hypothesis H4, that those who are religious according to their church's teachings consider the use of AI tools in the life of religious communities less ethical than those who are religious in their own way, was tested using the above-mentioned questions. The proportion of those who consider AI use ethical was examined in each of the 15 areas, and statistical tests (Chi-square test, Mann–Whitney U test, Spearman's rank correlation) were performed to determine the difference between the two groups. In summary, those religious according to their church's teachings considered the use of AI ethical in an average of 5.7327 areas (SD = 4.45621), while those religious in their own way considered it ethical in 9.2500 areas (SD = 5.25480). The Mann–Whitney U test indicated a significant difference (Z = −3.327, p = 0.001), supporting the claim that those who are religious in their own way consider the use of AI ethical in more areas. The standard deviations (4.45621 vs. 5.25480) show that the opinions of those who practice their religion in their own way are more divided. Comparing the responses on acceptability, credibility, and ethicality, those religious according to church teachings consider AI ethical (5.7327) and credible (5.7525) in almost the same number of areas, both lower than the number of areas in which they find it acceptable (6.2871). Those religious in their own way, by contrast, mentioned an average of 9.25 areas in which AI can be used ethically, which exceeds their mean values for both credibility (8.4688) and acceptability (8.8750).
Based on Table 9, those who are religious according to the teachings of their church consider the use of AI tools less ethical in all 15 areas than those who are religious in their own way. In only one case was the difference below 10% (religious community newsletter, 8.9%). In three cases the difference was 10–20% (music playlist 13.7%, email list creation 17.4%, live translation 18.8%); in nine areas it was 20–30% (leadership of religious communities 21.5%, text elements of media content 22%, event planning 23.3%, spiritual leader 23.3%, budget planning 23.4%, personalized religious messages 25%, religious podcast 26.5%, religious information search 28.6%, theological scientific research 29.5%); and in two cases it exceeded 30% (creation of visual elements of religious content 31.8%, religious education 38.1%). The unpooled Chi-square test was problematic in 10 areas because of low expected frequencies (1–4 cells < 5). After pooling (Unethical vs. Ethical), an expected count below 5 occurred in only one case. The Chi-square tests showed a significant relationship in 11 areas: (1) as a live translator (p = 0.048), (2) religious education (p = 0.000), (3) theological scientific research (p = 0.004), (4) budgeting (p = 0.021), (5) planning religious events (p = 0.022), (6) leading religious communities (p = 0.006), (7) creating textual elements of religious media content (p = 0.012), (8) visual elements of religious content (p = 0.001), (9) podcast (p = 0.006), (10) personalized religious messages (p = 0.003), and (11) searching for religious information (p = 0.004). In the one further area where an expected cell count remained below five, as a spiritual leader, the Fisher exact test showed a significant relationship (p = 0.000), so we also accept this as a statistically significant difference. The Mann–Whitney U test showed significant differences in 12 areas (p < 0.05): (1) music playlist (p = 0.044), (2) as a spiritual leader (p = 0.000), (3) religious education (p = 0.001), (4) theological research (p = 0.001), (5) email list of religious communities (p = 0.025), (6) planning events of religious communities (p = 0.008), (7) leading religious communities (p = 0.001), (8) text elements of religious media content (p = 0.012), (9) visual elements of religious content (p = 0.001), (10) religious podcast (p = 0.006), (11) creating and distributing personalized religious messages (p = 0.003), and (12) searching for religious information (p = 0.006). Spearman's rank correlation also showed a significant relationship in 12 areas; the three exceptions were live translation (p = 0.149), religious community newsletter (p = 0.090), and religious community budget (p = 0.055). Taking the 15 areas one by one, in only one area (religious community newsletter) did none of the indicators show a significant relationship between the two groups. The results confirm hypothesis H4: people who are religious according to the church's teachings consider the use of AI tools in the life of religious communities less ethical than people who are religious in their own way.

5. Summary

This study investigates the attitudes of Hungarian religious individuals toward the use of artificial intelligence (AI) in religious contexts, focusing on acceptance, credibility, and ethicality across various application areas. The findings, derived from a questionnaire survey of 133 respondents conducted between 10 February and 11 March 2025, are analyzed through the lens of Heidi Campbell’s Religious Social Shaping of Technology (RSST) model. Campbell’s RSST framework posits that religious communities negotiate technology adoption based on their core beliefs, values, and practices, shaping both the acceptance and application of technology within their contexts (Campbell 2010). The model emphasizes four key factors: (1) the community’s history and tradition, (2) core beliefs and values, (3) negotiation processes, and (4) communal boundaries and identity. These factors provide a robust framework for interpreting the study’s results, particularly the significant influence of religiosity type and the varying acceptance of AI across application domains.
The research examined the attitudes of Hungarians who are religious according to church teachings and of those who are religious in their own way, across 15 specific application areas. The questionnaire survey conducted between 10 February and 11 March 2025 involved 133 relevant respondents (42% male, 58% female; 75.9% religious according to church teachings, 24.1% religious in their own way). The research tested six hypotheses examining the acceptance, credibility, and ethicality of AI by gender, type of religiosity, and area of application.
The results are summarized below, hypothesis by hypothesis.
H1. 
Gender differences—Men are more accepting of the use of AI tools in religious communities.
On average, men found the use of AI acceptable in 7.07 areas out of 15, while women found it acceptable in 6.79, indicating a slight difference in favor of men. The largest differences were in planning events for religious communities (men: 64.2%, women: 50.7%) and in creating and distributing personalized religious messages (men: 33.9%, women: 20.8%). However, statistical tests (Mann–Whitney U, p = 0.773; Chi-square, p > 0.05) did not show any significant differences. Thus, Hypothesis H1 was not confirmed: gender differences are not decisive in the acceptance of AI.
The hypothesis that men are more accepting of AI tools in religious communities was not confirmed, as statistical tests (Mann–Whitney U, p = 0.773; Chi-square, p > 0.05) revealed no significant gender differences. Men accepted AI in slightly more areas (7.07 vs. 6.79 for women), with notable differences in event planning (64.2% vs. 50.7%) and personalized religious messaging (33.9% vs. 20.8%). However, these differences were not statistically significant. From the RSST perspective, this finding suggests that gender does not significantly shape the negotiation of AI within religious communities. Instead, other factors, such as religious identity or doctrinal adherence, may override gender-based predispositions. Campbell’s model highlights that communal values and negotiation processes often take precedence over individual demographic characteristics, which aligns with the lack of significant gender differences in this context.
H2. 
Religious self-classification—Those who are religious in their own way are more accepting of the use of AI for religious purposes.
Those who are religious in their own way accepted AI in 8.88 areas, while those who are religious according to church teachings accepted it in 6.29 areas; the difference is significant (Mann–Whitney U, p = 0.005). The largest differences can be observed in spiritual guidance (34.3% vs. 6%), religious podcasts (59.4% vs. 28.7%), and religious education (62.6% vs. 37.6%). Chi-square tests showed significant relationships in seven areas after pooling, while the Mann–Whitney U and Spearman tests showed significant relationships in 10–11 areas. Hypothesis H2 was confirmed: those who are religious in their own way are significantly more accepting.
The hypothesis that individuals who are religious in their own way are more accepting of AI was confirmed (Mann–Whitney U, p = 0.005). Those identifying as religious in their own way accepted AI in 8.88 areas compared to 6.29 for those adhering to church teachings, with significant differences in spiritual guidance (34.3% vs. 6%), religious podcasts (59.4% vs. 28.7%), and religious education (62.6% vs. 37.6%). This finding aligns with Campbell’s RSST model, particularly the role of core beliefs and values. Individuals who are religious in their own way likely have a more individualized and flexible approach to faith, which may foster openness to innovative technologies like AI. In contrast, those adhering to church teachings are bound by institutional doctrines and traditions, which may impose stricter boundaries on technology adoption. The negotiation process within these communities reflects a tension between maintaining traditional religious authority and embracing technological innovation, with the former group showing greater adaptability.
H3. 
Credibility—Those who are religious according to their church’s teachings consider AI less credible in religious communities.
Church-religious people considered AI to be credible in 5.75 areas on average, and those who are religious in their own way in 8.47 areas (Mann–Whitney U, p = 0.007). Significant differences were found, for example, in the areas of religious education (32.7% vs. 59.4%) and podcasting (25.7% vs. 50%). Statistical analyses (Chi-square, p < 0.05 in eight areas; Spearman, p < 0.05 in 12 areas) support the hypothesis. H3 was confirmed: those who are religious according to their church’s teachings consider AI to be less credible.
The hypothesis that church-religious individuals consider AI less credible was confirmed (Mann–Whitney U, p = 0.007), with church-religious respondents finding AI credible in 5.75 areas compared to 8.47 for those religious in their own way. Significant differences appeared in areas such as religious education (32.7% vs. 59.4%) and podcasting (25.7% vs. 50%). From the RSST perspective, this reflects the influence of communal history and tradition. Church-religious individuals, rooted in established ecclesiastical structures, may perceive AI as a threat to the authenticity of religious practices, particularly in domains requiring human judgment or spiritual authority. Campbell’s model suggests that religious communities assess technologies based on their alignment with core values, such as the sanctity of human-led religious functions. For those religious in their own way, the absence of rigid doctrinal constraints allows for a more pragmatic view of AI’s credibility.
H4. 
Ethicality—Those who are religious according to their church's teachings consider the use of AI in religious communities to be less ethical.
Church-religious people considered AI to be ethical in 5.73 areas, and those who are religious in their own way in 9.25 areas (Mann–Whitney U, p = 0.001). The largest differences were in the areas of religious education (30.7% vs. 68.8%) and visual content (30.7% vs. 62.5%). The Chi-square indicated a significant difference in 11 areas; Spearman indicated a significant difference in 12 areas (e.g., p = 0.000 for education). Hypothesis H4 was confirmed: church-religious people consider AI to be less ethical.
The hypothesis that church-religious individuals consider AI less ethical was confirmed (Mann–Whitney U, p = 0.001), with church-religious respondents deeming AI ethical in 5.73 areas compared to 9.25 for those religious in their own way. Significant differences were observed in religious education (30.7% vs. 68.8%) and visual content creation (30.7% vs. 62.5%). This finding aligns with Campbell’s emphasis on communal boundaries and identity. Church-religious communities often prioritize ethical frameworks grounded in tradition and human agency, viewing AI’s involvement in sacred or creative tasks as potentially undermining these values. In contrast, those religious in their own way, with less attachment to institutional norms, may perceive AI as a neutral tool, ethically permissible across a broader range of applications. The negotiation process here reflects a conservative stance among church-religious individuals, who prioritize maintaining doctrinal purity over technological integration.
H5. 
Religious media content production—More accepted than spirituality and rituals.
In media content production, acceptance is medium or low: text elements 29.4%, visual content 40.6%, podcast 36.1%. Functions related to spirituality (spiritual leader: 12.8%; community leadership: 19.6%) show lower acceptance. In the case of visual content, acceptance is higher, but not for text content (e.g., sermon). Therefore, H5 was partially confirmed: the acceptance of media content only exceeds the spiritual areas in certain cases.
The hypothesis that AI is more accepted in media content production than in spiritual or ritual functions was partially confirmed. Acceptance was moderate for media-related tasks (visual content: 40.6%, podcasts: 36.1%, text elements: 29.4%) but lower for spiritual functions (spiritual leadership: 12.8%, community leadership: 19.6%). Visual content showed higher acceptance, but text-based content (e.g., sermons) did not consistently outperform spiritual functions. According to Campbell’s RSST model, this pattern reflects the influence of core beliefs and communal boundaries. Media production, perceived as a technical or creative task, aligns more readily with secular applications of technology, making it less threatening to religious identity. Spiritual and ritual functions, however, are deeply tied to communal traditions and human agency, leading to lower acceptance due to concerns about authenticity and sacredness.
H6. 
Religious education and research—More accepted than spirituality and rituals.
The acceptance of religious education (43.6%) and theological research (54.1%) exceeds that of the spiritual leader (12.8%) and community leadership (19.6%) functions. Information search (61.6%) also shows a high value. The data clearly support hypothesis H6: acceptance in educational and research areas is markedly higher than in spiritual functions.
The hypothesis that AI is more accepted in religious education and research than in spiritual or ritual functions was confirmed. Acceptance was higher for education (43.6%), theological research (54.1%), and information search (61.6%) compared to spiritual leadership (12.8%) and community leadership (19.6%). This finding aligns with Campbell’s RSST model, particularly the negotiation process and core values. Educational and research applications are perceived as technical or administrative, aligning with practical functions that do not challenge the sacred core of religious practice. In contrast, spiritual and ritual roles are central to religious identity and tradition, making AI’s involvement in these areas less acceptable. The higher acceptance of technical tasks reflects a communal negotiation that prioritizes maintaining the sanctity of spiritual roles while allowing technology in less sacred domains.
The acceptance of AI is higher in administrative (e.g., email lists: 64.7%) and technical (e.g., translation: 65.4%) tasks and lower in spiritual (e.g., spiritual leadership: 12.8%) and creative (e.g., preaching: 29.4%) areas. The perceived authenticity (mean: 6.41) and ethicality (mean: 6.58) are generally lower than acceptance (mean: 6.91), especially among those religious according to church teachings.
The research showed that the type of religiosity strongly influences the perception of AI: those who are religious in their own way are more accepting and consider the use of AI in religious areas more authentic and ethical, while those who are religious according to church teachings are more conservative. Gender differences were not significant. The technical applications of AI are more accepted than its spiritual or creative roles, indicating the limits of technological adaptation in religious communities.
The study’s findings reveal a clear pattern: the type of religiosity strongly influences attitudes toward AI, with those religious in their own way demonstrating greater acceptance, credibility, and ethicality across more application areas. This aligns with Campbell’s RSST model, which emphasizes that religious communities shape technology based on their history, core beliefs, negotiation processes, and communal boundaries. Church-religious individuals, bound by institutional traditions and doctrinal frameworks, exhibit a conservative stance, perceiving AI as less credible and ethical, particularly in spiritual and creative roles. Their negotiation process prioritizes preserving communal identity and the sanctity of human-led religious practices, leading to lower acceptance of AI in sacred domains. In contrast, those religious in their own way, with more individualized beliefs, are less constrained by tradition, allowing for greater openness to AI across diverse applications. The higher acceptance of AI in administrative (e.g., email lists: 64.7%, translation: 65.4%) and technical tasks compared to spiritual (e.g., spiritual leadership: 12.8%) or creative roles (e.g., preaching: 29.4%) further supports Campbell’s framework. Technical tasks are perceived as peripheral to core religious values, making them more amenable to technological integration. Spiritual and creative roles, however, are central to religious identity, and their automation raises concerns about authenticity and ethics, particularly among church-religious individuals. The lower mean scores for authenticity (6.41) and ethics (6.58) compared to acceptance (6.91) underscore these concerns, reflecting a cautious negotiation process within religious communities.
The findings of this study, interpreted through Campbell’s RSST model, highlight the complex interplay between religious identity and technology adoption. The type of religiosity—whether church-based or individualized—shapes the negotiation of AI’s role in religious contexts, with church-religious individuals exhibiting greater resistance due to concerns over tradition, authenticity, and ethics. The acceptance of AI in technical and administrative tasks, as opposed to spiritual or creative ones, reflects communal boundaries that prioritize the preservation of sacred practices. These insights underscore the importance of considering religious values and communal dynamics when introducing technologies like AI into religious settings, as they significantly influence the extent and nature of technological adoption.
The study provides valuable insights into the attitudes of Hungarian religious individuals toward the application of artificial intelligence (AI) in religious contexts; however, several limitations must be acknowledged. First, the sample size of 133 respondents, while relevant, is relatively small and not representative of the broader Hungarian population or religious communities globally. This constrains the generalizability of the findings. Second, the research was geographically limited to Hungary, which may introduce cultural and contextual biases, as attitudes toward AI in religious settings may vary significantly across different countries and religious traditions. Third, the reliance on a questionnaire-based survey conducted between 10 February and 11 March 2025 may not fully capture the nuanced and evolving perceptions of AI, particularly as technological advancements and societal attitudes continue to develop rapidly. Additionally, the study focused on 15 specific application areas, which, while comprehensive, may not encompass all possible uses of AI in religious contexts, potentially overlooking other relevant domains. Finally, the self-reported nature of the data introduces the possibility of response bias, where participants’ answers may reflect social desirability or other subjective influences rather than their true beliefs.
To address these limitations and extend the scope of the current study, several avenues for future research are proposed. First, expanding the sample size and including a more diverse, international cohort of respondents would enhance the representativeness and cross-cultural applicability of the findings. Comparative studies across different religious traditions and geographic regions could elucidate how cultural and theological factors shape attitudes toward AI. Second, longitudinal research designs could be employed to track changes in perceptions of AI over time, particularly as religious communities gain greater exposure to AI technologies. Third, incorporating qualitative methods, such as in-depth interviews or focus groups, could provide richer insights into the reasons behind respondents’ attitudes, particularly regarding the perceived authenticity, credibility, and ethics of AI applications. Fourth, future studies could explore additional AI application areas, such as AI-driven pastoral care or interfaith dialogue facilitation, to broaden the understanding of its potential roles in religious contexts. Finally, investigating the impact of educational interventions or exposure to AI tools on religious individuals’ acceptance and perceptions could inform strategies for integrating AI ethically and effectively within religious communities. These directions would contribute to a more comprehensive understanding of the intersection between AI and religiosity, addressing both practical and theoretical implications.

Author Contributions

Conceptualization, M.A. and Z.R.; methodology, Z.R.; software, Z.R.; validation, M.A., Z.R. and S.D.; formal analysis, Z.R.; investigation, M.A., Z.R. and S.D.; resources, M.A., Z.R. and S.D.; data curation, Z.R.; writing—original draft preparation, M.A., Z.R. and S.D.; writing—review and editing, M.A., Z.R. and S.D.; visualization, Z.R.; supervision, Z.R.; project administration, S.D. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

The study was conducted in accordance with the Declaration of Helsinki and approved by the Ethics Committee of Pázmány Péter Catholic University Communication and Media Studies. The research was conducted in accordance with the Research Ethics Regulations of the Institute of Communication and Media Studies of Pázmány Péter Catholic University. The Research Ethics Permit was issued with the number PPKE KMI KE 2024/2.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The data are contained within the article.

Acknowledgments

During the preparation of this manuscript/study, the author(s) used Google Translate for the purposes of checking English spelling. The authors have reviewed and edited the output and take full responsibility for the content of this publication.

Conflicts of Interest

The authors declare no conflicts of interest.

Note

1
Faith and Algorithm Conference, 25 May 2025. Available online: https://happeningnext.com/event/hit-%C3%A9s-algoritmus-konferencia-mesters%C3%A9ges-intelligence-a-katolikus-k%C3%B6z%C3%B6ss%C3%A9gek-szolg%C3%A1lat%C3%A1ban-eid3a0cebt9os (accessed on 20 July 2025). Can artificial intelligence preach? 27 February 2025. Available online: https://www.evangelikus.hu/hireink/teologia/de-hol-van-a-helye-a-szentleleknek-teologia-es-mesterseges-intelligence (accessed on 20 July 2025). Artificial intelligence–switching from follower mode to leader mode, 27 April 2024. Available online: https://azeloigeeve.reformatus.hu/oktatas/hirek/mesterseges-intelligence-koveto-uzemmodbol-haladora-kaplpni/ (accessed on 20 July 2025).

References

  1. AbdelRahman, Fadwa, and Reeham R. Mohammed. 2023. AI in the Middle East: Balancing Cultural Identity, Gender Dynamics, and Religious Perspectives. In Thinking Tools on AI, Religion and Culture. Edited by Heidi A. Campbell and Pauline Hope Cheong. London: Digital Religion Publications, pp. 17–20. [Google Scholar]
  2. AI Act. 2024. Available online: https://eur-lex.europa.eu/eli/reg/2024/1689/oj/eng (accessed on 24 March 2025).
  3. Alkhouri, Khader I. 2024. The role of artificial intelligence in the study of the psychology of religion. Religions 15: 290. [Google Scholar] [CrossRef]
  4. Almási, Zsolt. 2024. Human Agency Reloaded in our Techno-social Ecosystem. Contemporary Humanism Open Access Annals Studium 2024: 174–81. [Google Scholar]
  5. Amegbeha, Anne. 2023. Exploring the Intersection of AI, Religion, and Culture: Questions and Principles for Examination. In Thinking Tools on AI, Religion and Culture. Edited by Heidi A. Campbell and Pauline Hope Cheong. London: Digital Religion Publications, pp. 27–30. [Google Scholar]
  6. Andok, Mónika. 2022. Digitális Vallás: Az új Információs Technológiák Hatása a Vallásokra, Vallási Közösségekre Magyarországon (Digital Religion: The Impact of New Information Technologies on Religions and Religious Communities in Hungary). Budapest: Ludovika Egyetemi Kiadó. [Google Scholar]
  7. Andriansyah, Yuli. 2023. The current rise of artificial intelligence and religious studies: Some reflections based on ChatGPT. Millah: Journal of Religious Studies 22: ix–xviii. [Google Scholar]
  8. Artificial Intelligence Strategy of Hungary. 2020–2030. Available online: https://cdn.kormany.hu/uploads/document/6/67/676/676186555d8df2b1408982bb6ce81c643d5fa4ab.pdf (accessed on 24 March 2025).
  9. Ashraf, Cameran. 2022. Exploring the impacts of artificial intelligence on freedom of religion or belief online. The International Journal of Human Rights 26: 757–91. [Google Scholar] [CrossRef]
  10. Astley, Jeff. 2002. In defence of ‘ordinary theology’. British Journal of Theological Education 13: 21–35. [Google Scholar] [CrossRef]
  11. Astley, Jeff. 2016. The analysis, investigation and application of ordinary theology. In Exploring Ordinary Theology. Abingdon-on-Thames: Routledge, pp. 1–9. [Google Scholar]
  12. Bagyinszki, Péter Ágoston. 2024. A “résztvevő megismerés” és a Nemzetközi Teológiai Bizottság 2011-es dokumentuma (“Participatory cognition” and the 2011 document of the International Theological Commission). SAPIENTIANA 17: 1–20. [Google Scholar]
  13. Barzilai-Nahon, Karine, and Gad Barzilai. 2005. Cultured technology: The Internet and religious fundamentalism. The Information Society 21: 25–40. [Google Scholar] [CrossRef]
  14. Bukovinszky-Csáki, Orsolya. 2025. Az újságírás és a mesterséges intelligencia. Generatív MI-eszközök etikus használati módjai az újságírásban (Journalism and artificial intelligence: Ethical ways of using generative AI tools in journalism). Korunk 4: 23–30. [Google Scholar]
  15. Campbell, Heidi A. 2010. When Religion Meets New Media. London and New York: Routledge Taylor and Francis Group. [Google Scholar]
  16. Campbell, Heidi A. 2013. Digital Religion: Understanding Religious Practice in New Media Worlds. London and New York: Routledge Taylor and Francis Group. [Google Scholar]
  17. Campbell, Heidi A. 2017. Surveying theoretical approaches within digital religion studies. New Media & Society 19: 15–24. [Google Scholar]
  18. Campbell, Heidi A. 2023. Evoking and Creating Theological Dialogue Around the AI-Nonhuman-Other for the Sake of Our Human-Technological Future. In Thinking Tools on AI, Religion and Culture. Edited by Heidi A. Campbell and Pauline Hope Cheong. London: Digital Religion Publications, pp. 22–25. [Google Scholar]
  19. Cartella, Giuseppe, Vittorio Cuculo, Marcella Cornia, Marco Papasidero, Federico Ruozzi, and Rita Cucchiara. 2025. Pixels of faith: Exploiting visual saliency to detect religious image manipulation. In European Conference on Computer Vision. Cham: Springer, pp. 229–45. [Google Scholar]
  20. Cheong, Pauline Hope. 2020a. Religion, robots and rectitude: Communicative affordances for spiritual knowledge and community. Applied Artificial Intelligence 34: 412–31. [Google Scholar] [CrossRef]
  21. Cheong, Pauline Hope. 2020b. Robots, Religion and Communication. Religion in the Age of Digitalization: From New Media to Spiritual Machines. In Religion in the Age of Digitalization. Edited by Giulia Isetti, Elisa Innerhofer, Harald Pechlaner and Michael De Rachewiltz. New York: Routledge, pp. 84–95. [Google Scholar]
22. Cheong, Pauline Hope. 2023. When Machines Need Humans: Considerations of Religious Human–Machine Communication and Bounded Religious Automation. In Thinking Tools on AI, Religion and Culture. Edited by Heidi A. Campbell and Pauline Hope Cheong. London: Digital Religion Publications, pp. 13–15. [Google Scholar]
  23. Davie, Grace. 1990. Believing without belonging: Is this the future of religion in Britain? Social Compass 37: 455–69. [Google Scholar] [CrossRef]
  24. Davie, Grace. 1997. Believing without belonging: A framework for religious transmission. Recherches Sociologiques (Louvain-la-Neuve) 28: 17–37. [Google Scholar]
  25. Davis, Fred D. 1989. Perceived usefulness, perceived ease of use, and user acceptance of information technology. MIS Quarterly 13: 319–40. [Google Scholar] [CrossRef]
26. Dietz, Ferenc. 2020. A mesterséges intelligencia az oktatásban: Kihívások és lehetőségek (Artificial Intelligence in Education: Challenges and Opportunities). Scientia Et Securitas 1: 54–63. [Google Scholar] [CrossRef]
  27. Digital Decade. 2025. Country Reports—Hungary. Available online: https://digital-strategy.ec.europa.eu/hu/library/digital-decade-2025-country-reports (accessed on 24 March 2025).
28. Dornics, Szilvia. 2025. Hallgatói különbségek a mesterséges intelligencia tükrében (Student Differences in Light of Artificial Intelligence). JEL-KÉP 4: 53–74. [Google Scholar] [CrossRef]
  29. Düzbayır, Bilal. 2025. A New Dimension in the Paradigm of Social Change: Artificial Intelligence and the Transformation of Religious Life. Eskişehir Osmangazi Üniversitesi İlahiyat Fakültesi Dergisi 12: 20–42. [Google Scholar] [CrossRef]
30. Fernández, Víctor Manuel, José Tolentino de Mendonça, Armando Matteo, and Paul Tighe. 2025. Antiqua et Nova: Note on the Relationship Between Artificial Intelligence and Human Intelligence. Available online: https://www.vatican.va/roman_curia/congregations/cfaith/documents/rc_ddf_doc_20250128_antiqua-et-nova_en.html (accessed on 24 March 2025).
31. Gyorgyovich, Miklós. 2023. A 2022-es népszámlálás felekezeti hovatartozás-mérésének margójára (Notes on the Measurement of Religious Affiliation in the 2022 Census). Budapest: Századvég, pp. 121–35. [Google Scholar]
  32. Habib, Ahmed, and Hung Manh La. 2021. Evaluating the Co-dependence and Co-existence between Religion and Robots: Past, Present and Insights on the Future. International Journal of Social Robotics 13: 219–35. [Google Scholar]
  33. Haselager, Pim. 2024. From angels to artificial agents? AI as a mirror for human (im) perfection. Zygon 59: 661–787. [Google Scholar] [CrossRef]
34. Horváth, László. 2023. Feltáró szakirodalmi áttekintés a mesterséges intelligencia oktatási használatáról (An Exploratory Literature Review on the Educational Use of Artificial Intelligence). Pannon Digitális Pedagógia (E-Tanulás—Távoktatás—Oktatás-informatika) 3: 5–17. [Google Scholar] [CrossRef]
  35. Jackson, Joshua Conrad, Kai Chi Yam, Pok Man Tang, Ting Liu, and Azim Shariff. 2023. Exposure to Robot Preachers Undermines Religious Commitment. Journal of Experimental Psychology: General 152: 3344–58. [Google Scholar] [CrossRef]
36. Tahyné Kovács, Ágnes. 2024. A természet jogai jelenségről világi- és egyházi felfogásban (On the Rights-of-Nature Phenomenon in Secular and Ecclesiastical Understanding). ProFuturo 14. [Google Scholar] [CrossRef]
  37. Kozak, Jaroslaw, and Stanislaw Fel. 2024. The Relationship between Religiosity Level and Emotional Responses to Artificial Intelligence in University Students. Religions 15: 331. [Google Scholar] [CrossRef]
  38. KSH. 2022. Népszámlálási Adatbázis. Available online: https://nepszamlalas2022.ksh.hu/adatbazis/ (accessed on 24 March 2025).
39. Laurinyecz, Mihály. 2019. Az emberi méltóság fogalma napjaink bioetikai vitáinak érvelésrendszerében (The Concept of Human Dignity in the Argumentation of Contemporary Bioethical Debates). In Istenképiség, Átistenülés, Emberi Méltóság: Teológiai Tanulmányok. Edited by Attila Puskás and László Perendy. Budapest: Szent István Társulat, pp. 214–31. [Google Scholar]
40. MacKenzie, Donald, and Judy Wajcman. 1999. Introductory essay: The social shaping of technology. In The Social Shaping of Technology, 2nd ed. Edited by Donald MacKenzie and Judy Wajcman. Buckingham: Open University Press. [Google Scholar]
  41. Mannerfelt, Frida, and Rikard Roitto. 2025. Preaching with AI: An exploration of preachers’ interaction with large language models in sermon preparation. Practical Theology 18: 127–38. [Google Scholar] [CrossRef]
  42. Marshall, P. David, Christopher Moore, and Kim Barbour. 2019. Persona Studies: An Introduction. Hoboken: John Wiley & Sons. [Google Scholar]
43. Tomka, Miklós. 1999. A magyar vallási helyzet öt dimenziója (Five Dimensions of the Hungarian Religious Situation). Magyar Tudomány 5: 549–59. [Google Scholar]
  44. Moore, Christopher, Kim Barbour, and Katja Lee. 2017. Five dimensions of online persona. Persona Studies 3: 1–11. [Google Scholar] [CrossRef]
  45. Nord, Ilona, Charles Ess, Jörn Hurtienne, and Thomas Schlag. 2023. Robotics in Christian religious practice: Reflections on initial experiments in this field. OPUS 2023: 1–35. [Google Scholar]
  46. Ocampo, Leonardo M., and Irene E. Gozum. 2023. AI in the academe: Opportunities and challenges for religious education. Religion and Social Communication 22: 2–22. [Google Scholar] [CrossRef]
47. Ott, Kate. 2023. AI & the Historical Intractability of Human. In Thinking Tools on AI, Religion and Culture. Edited by Heidi A. Campbell and Pauline Hope Cheong. London: Digital Religion Publications, pp. 35–38. [Google Scholar]
48. Rahmi, Melvi, Dahyul Daipon, and Nelna Saprina. 2024. AI Simulation of Digital Afterlife: Artificial Intelligence in the Framework of Islamic Law. ICMIL Proceedings 1: 167–76. [Google Scholar]
49. Rajki, Zoltán. 2023. A mesterséges intelligencián alapuló alkalmazások a bölcsészet-, társadalomtudomány és az oktatás területén (AI-Based Applications in the Humanities, Social Sciences and Education). Humán Innovációs Szemle 14: 4–21. [Google Scholar] [CrossRef]
50. Rajki, Zoltán, Judit T. Nagy, and Ida Dringó-Horváth. 2024. A mesterséges intelligencia a felsőoktatásban (Artificial Intelligence in Higher Education). Iskolakultúra 347: 3–22. [Google Scholar] [CrossRef]
  51. Reed, Randall. 2021. AI in Religion, AI for Religion, AI and Religion: Towards a theory of religious studies and artificial intelligence. Religions 12: 401. [Google Scholar] [CrossRef]
  52. Rendsburg, Melissa A. 2019. The Impact of Artificial Intelligence on Religion: Reconciling a New Relationship with God. Political Science-United Nations & Global Policy Studies. Available online: https://d1wqtxts1xzle7.cloudfront.net/109479700/The_Impact_of_Artificial_Intelligence_on_Religion-libre.pdf?1703372263=&response-content-disposition=inline%3B+filename%3DThe_Impact_of_Artificial_Intelligence_on.pdf&Expires=1753790507&Signature=XOgDUTgI~K (accessed on 24 March 2025).
  53. Sierocki, Radoslaw. 2024. Algorithms and Faith: The Meaning, Power, and Causality of Algorithms in Catholic Online Discourse. Religions 15: 431. [Google Scholar] [CrossRef]
  54. Simmerlein, Jonas. 2024. Sacred Meets Synthetic: A Multi-Method Study on the First AI Church Service. Review of Religious Research 67: 126–45. [Google Scholar] [CrossRef]
55. Singler, Beth. 2023. How I Stopped Worrying and Learned to Question the Apocalyptic AI. In Thinking Tools on AI, Religion and Culture. Edited by Heidi A. Campbell and Pauline Hope Cheong. London: Digital Religion Publications, pp. 10–12. [Google Scholar]
  56. Tampubolon, Manotar, and Bernadetha Nadeak. 2024. Artificial intelligence and understanding of religion: A moral perspective. International Journal of Multicultural and Multireligious Understanding 11: 903–14. [Google Scholar]
  57. Temperman, Jeroen. 2023. Artificial Intelligence and Religious Freedom. In Artificial Intelligence and Human Rights. Oxford: Oxford University Press, pp. 61–75. [Google Scholar]
58. Trotta, Susanna, Deborah Sabrina Iannotti, and Boris Rähme. 2024. Religious actors and artificial intelligence: Examples from the field and suggestions for further research. Religion and Development 1 (aop): 1–25. [Google Scholar]
  59. Tsuria, Ruth, and Yossi Tsuria. 2024. Artificial intelligence’s understanding of religion: Investigating the moralistic approaches presented by generative artificial intelligence tools. Religions 15: 375. [Google Scholar] [CrossRef]
  60. Turós, Mátyás, Róbert Nagy, and Zoltán Szűts. 2025. What percentage of secondary school students do their homework with the help of artificial intelligence?—A survey of attitudes towards artificial intelligence. Computers and Education: Artificial Intelligence 8: 100394. [Google Scholar] [CrossRef]
  61. Ty, Rey. 2023. Impact of AI-powered technology on religious practices and ethics: The road ahead. Religion and Social Communication Journal, 2–21. [Google Scholar] [CrossRef]
  62. Vaughan, Geoff, Jinil Yoo, and Rita Szűts-Novak. 2025. Wisdom of the Heart: A Contemporary Review of Religion and AI. Religions 16: 834. [Google Scholar] [CrossRef]
  63. Venkatesh, Viswanath, Michael G. Morris, Gordon B. Davis, and Fred D. Davis. 2003. User acceptance of information technology: Toward a unified view. MIS Quarterly 27: 425–78. [Google Scholar] [CrossRef]
64. Verma, Koyal. 2023. Religion, Culture and Artificial Intelligence: A Critical Inquiry into Identity Formation in the Global World. In Thinking Tools on AI, Religion and Culture. Edited by Heidi A. Campbell and Pauline Hope Cheong. London: Digital Religion Publications, pp. 48–50. [Google Scholar]
Table 1. Hypotheses and Corresponding Statistical Analyses.
Hypothesis | Variables | Main Statistical Tests Used | Justification
H1 | Gender × acceptance of AI (15 items) | Pearson's Chi-square, Mann–Whitney U, Spearman's rho | We tested gender-based differences in AI acceptance using categorical and ordinal tests due to the non-normal distribution of Likert data.
H2 | Type of religiosity × acceptance of AI | Mann–Whitney U, Spearman's rho | Differences between self-defined and church-defined religiosity in AI acceptance were analyzed using non-parametric tests appropriate for ordinal responses.
H3 | Type of religiosity × authenticity scores | Mann–Whitney U, Spearman's rho | We examined perceived authenticity of AI use across religiosity types with ordinal, non-parametric tests due to the nature of the data.
H4 | Type of religiosity × ethicality scores | Mann–Whitney U, Spearman's rho | Ethical perceptions were tested between the two religiosity groups using ordinal-level, non-parametric methods.
H5 | Function type: media content vs. ritual/spiritual | Descriptive comparison of acceptance rates | Acceptance rates were descriptively compared between media-related and spiritual/ritual functions based on percentage differences.
H6 | Function type: education/research vs. ritual/spiritual | Descriptive comparison of acceptance rates | Acceptance of AI in education and research was compared to spiritual/ritual domains using descriptive percentages.
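For readers who wish to reproduce this kind of analysis, the sketch below shows how the three tests named in Table 1 can be run on ordinal Likert-type responses in Python with SciPy. The variable names and toy data are illustrative assumptions, not the study's dataset or analysis code.

```python
# Minimal sketch of the tests in Table 1 (SciPy); toy data, not the study's dataset.
from scipy.stats import chi2_contingency, mannwhitneyu, spearmanr

# Acceptance of one AI function on a 1-4 Likert scale, split by group
# (e.g., men vs. women for H1, or church-defined vs. self-defined for H2).
group_a = [1, 2, 4, 3, 2, 4, 1, 3, 2, 4]
group_b = [1, 1, 2, 3, 1, 2, 2, 1, 3, 2]

# Pearson's chi-square on the group x response contingency table.
counts = [[group_a.count(k) for k in (1, 2, 3, 4)],
          [group_b.count(k) for k in (1, 2, 3, 4)]]
chi2, p_chi2, dof, _ = chi2_contingency(counts)

# Mann-Whitney U: compares the two groups' response distributions
# without assuming normality (appropriate for ordinal data).
u_stat, p_mw = mannwhitneyu(group_a, group_b, alternative="two-sided")

# Spearman's rho: rank correlation between group membership (0/1) and response.
membership = [0] * len(group_a) + [1] * len(group_b)
rho, p_rho = spearmanr(membership, group_a + group_b)

print(f"chi2 p={p_chi2:.3f}, Mann-Whitney p={p_mw:.3f}, rho={rho:.3f} (p={p_rho:.3f})")
```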
Table 2. Do you know your church's teaching on Artificial Intelligence?
Which church do you identify with? | Yes | No | Total
Catholic | 27 (30.7%) | 61 (69.3%) | 88 (100.0%)
Other | 10 (22.2%) | 35 (77.8%) | 45 (100.0%)
Total | 37 (27.8%) | 96 (72.2%) | 133 (100.0%)
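As a quick check of the association in Table 2, a chi-square test can be computed directly from the counts above. This is an illustrative computation, not the authors' published analysis.

```python
# Chi-square test on the Table 2 contingency table (counts taken from the table).
from scipy.stats import chi2_contingency

observed = [[27, 61],   # Catholic: Yes / No
            [10, 35]]   # Other:    Yes / No
chi2, p, dof, expected = chi2_contingency(observed)
print(f"chi2={chi2:.3f}, dof={dof}, p={p:.3f}")
```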
Table 3. Do you consider the use of AI tools in the following religious functions acceptable? Data in percentages.
Function | Not at all | Rather not | To a small extent | To a large extent | Not accepting | Accepting
In the form of a music playlist | 21.1 | 18 | 39.8 | 21.1 | 39.1 | 60.9
As a live translation | 16.5 | 18 | 28.6 | 36.8 | 34.5 | 65.4
As a spiritual leader | 75.2 | 12 | 10.5 | 2.3 | 87.2 | 12.8
In the field of religious education | 33.8 | 22.6 | 33.8 | 9.8 | 56.4 | 43.6
In the field of scientific research of a theological nature | 26.3 | 19.5 | 35.3 | 18.8 | 45.8 | 54.1
In the field of updating the email list of religious communities | 20.3 | 15 | 33.1 | 31.6 | 35.3 | 64.7
In the field of religious community newsletters | 20.3 | 21.8 | 36.8 | 21.1 | 42.1 | 57.9
In planning the budget of religious communities | 21.1 | 17.3 | 34.6 | 27.1 | 38.4 | 61.7
In planning events of religious communities | 22.6 | 21.1 | 41.4 | 15 | 43.7 | 56.4
In the field of religious community leadership | 51.9 | 28.6 | 15.8 | 3.8 | 80.5 | 19.6
In creating text elements of religious media content (sermons, mission messages) | 50.4 | 20.3 | 21.1 | 8.3 | 70.7 | 29.4
In creating visual and moving image elements of religious content | 36.8 | 22.6 | 33.1 | 7.5 | 59.4 | 40.6
In creating religious podcasts | 39.1 | 24.8 | 26.3 | 9.8 | 63.9 | 36.1
In creating and distributing personalized religious messages | 55.6 | 18 | 18.8 | 7.5 | 73.6 | 26.3
In searching for religious information | 20.3 | 18 | 33.8 | 27.8 | 38.3 | 61.6
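The pooled columns in Table 3 are simple sums of the four-point scale: "Not accepting" adds the two rejecting categories and "Accepting" adds the two approving ones. A one-line check, using the music playlist row as an example:

```python
# Pooling the four-point scale into the two summary columns of Table 3
# (values taken from the "music playlist" row above).
not_at_all, rather_not, small_extent, large_extent = 21.1, 18.0, 39.8, 21.1

not_accepting = not_at_all + rather_not        # 39.1
accepting = small_extent + large_extent        # 60.9
print(round(not_accepting, 1), round(accepting, 1))
```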
Table 4. Do you consider the use of AI tools in the following religious functions acceptable? Broken down by gender (M = men, W = women). Data in percentages.
Function | Not at all (M/W) | Rather not (M/W) | To a small extent (M/W) | To a large extent (M/W) | Not accepting (M/W) | Accepting (M/W) | Chi-square | Mann–Whitney U | Spearman
In the form of a music playlist | 25/18.2 | 19.6/16.9 | 30.4/46.8 | 25/18.2 | 44.6/35.1 | 55.4/65 | p = 0.289 | p = 0.676 | ρ = 0.036 (p = 0.678)
As a live translation | 19.6/14.3 | 12.5/22.1 | 25/31.2 | 42.9/32.5 | 32.1/36.4 | 67.9/63.7 | p = 0.309 | p = 0.478 | ρ = −0.062 (p = 0.480)
As a spiritual leader | 71.4/77.9 | 14.3/10.4 | 8.9/11.7 | 5.4/0 | 85.7/88.3 | 14.3/11.7 | p = 0.175 | p = 0.368 | ρ = −0.078 (p = 0.370)
In the field of religious education | 35.7/32.5 | 21.4/23.4 | 33.9/33.8 | 8.9/10.4 | 57.1/55.9 | 42.8/44.2 | p = 0.973 | p = 0.742 | ρ = 0.029 (p = 0.743)
In the field of scientific research of a theological nature | 25/27.3 | 21.4/18.2 | 32.1/37.7 | 21.4/16.9 | 46.4/45.5 | 53.5/54.6 | p = 0.832 | p = 0.762 | ρ = −0.026 (p = 0.763)
In the field of updating the email list of religious communities | 23.2/18.2 | 8.9/19.5 | 35.7/31.2 | 32.1/31.2 | 32.1/37.7 | 67.8/62.4 | p = 0.389 | p = 0.877 | ρ = −0.013 (p = 0.878)
In the field of religious community newsletters | 23.2/18.2 | 16.1/26 | 41.1/33.8 | 19.6/22.1 | 39.3/44.2 | 60.7/55.9 | p = 0.489 | p = 0.945 | ρ = 0.006 (p = 0.945)
In planning the budget of religious communities | 19.6/22.1 | 12.5/20.8 | 35.7/33.8 | 32.1/23.4 | 32.1/42.9 | 67.8/57.2 | p = 0.506 | p = 0.224 | ρ = −0.106 (p = 0.225)
In planning events of religious communities | 21.4/23.4 | 14.3/26 | 44.6/39 | 19.6/11.7 | 35.7/49.4 | 64.2/50.7 | p = 0.287 | p = 0.153 | ρ = −0.124 (p = 0.154)
In the field of religious community leadership | 55.4/49.4 | 23.2/32.5 | 14.3/16.9 | 7.1/1.3 | 78.6/81.9 | 21.4/18.2 | p = 0.233 | p = 0.809 | ρ = 0.021 (p = 0.810)
In creating text elements of religious media content (sermons, mission messages) | 51.8/49.4 | 21.4/19.5 | 16.1/24.7 | 10.7/6.5 | 73.2/68.9 | 26.8/31.2 | p = 0.586 | p = 0.807 | ρ = 0.021 (p = 0.808)
In creating visual and moving image elements of religious content | 37.5/36.4 | 19.6/27.3 | 37.5/22.1 | 5.4/14.3 | 57.1/63.7 | 42.9/36.4 | p = 0.114 | p = 0.676 | ρ = 0.007 (p = 0.938)
In creating religious podcasts | 35.7/41.6 | 35.7/16.9 | 19.6/31.2 | 8.9/10.4 | 71.4/58.5 | 28.5/41.6 | p = 0.085 | p = 0.723 | ρ = 0.031 (p = 0.725)
In creating and distributing personalized religious messages | 53.6/57.1 | 12.5/22.1 | 23.2/15.6 | 10.7/5.2 | 66.1/79.2 | 33.9/20.8 | p = 0.257 | p = 0.331 | ρ = −0.085 (p = 0.333)
In searching for religious information | 17.9/22.1 | 19.6/16.9 | 35.7/32.5 | 26.8/28.6 | 37.5/39 | 62.5/61.1 | p = 0.906 | p = 0.896 | ρ = −0.011 (p = 0.897)
Table 5. Do you consider the use of AI tools in the following religious functions acceptable? Broken down by type of religiosity (Church = religious according to the church's teachings, Self = religious in their own way). Data in percentages.
Function | Not at all (Church/Self) | Rather not (Church/Self) | To a small extent (Church/Self) | To a large extent (Church/Self) | Not accepting (Church/Self) | Accepting (Church/Self) | Chi-square (without pooling) | Chi-square (after pooling) | Mann–Whitney U | Spearman
In the form of a music playlist | 23.8/12.5 | 19.8/12.5 | 36.6/50 | 19.8/25 | 43.6/25 | 56.4/75 | p = 0.31 | p = 0.061 | p = 0.103 | ρ = 0.144 (p = 0.098)
As a live translation | 19.8/6.2 | 18.8/15.6 | 28.7/28.1 | 32.7/50 | 38.6/21.8 | 61.4/78.1 | p = 0.186 | p = 0.083 | p = 0.034 | ρ = 0.187 (p = 0.032)
As a spiritual leader | 84.2/46.9 | 9.9/18.8 | 5/28.1 | 1/6.2 | 94.1/65.7 | 6/34.3 | p = 0.000 | p = 0.000 | p = 0.000 | ρ = 0.396 (p = 0.000)
In the field of religious education | 38.6/18.8 | 23.8/18.8 | 30.7/43.8 | 6.9/18.8 | 62.4/37.4 | 37.6/62.6 | p = 0.05 | p = 0.013 | p = 0.007 | ρ = 0.238 (p = 0.006)
In the field of scientific research of a theological nature | 31.7/9.4 | 19.8/18.8 | 32.7/43.8 | 15.8/28.1 | 51.5/28.2 | 48.5/71.9 | p = 0.06 (likelihood ratio p = 0.041) | p = 0.021 | p = 0.009 | ρ = 0.231 (p = 0.008)
In the field of updating the email list of religious communities | 22.8/12.5 | 14.9/15.6 | 35.6/25 | 26.7/46.9 | 37.7/28.1 | 62.3/71.9 | p = 0.156 | p = 0.327 | p = 0.060 | ρ = 0.154 (p = 0.076)
In the field of religious community newsletters | 21.8/15.6 | 21.8/21.9 | 37.6/34.4 | 18.8/28.1 | 43.6/37.5 | 56.4/62.5 | p = 0.677 | p = 0.545 | p = 0.307 | ρ = 0.089 (p = 0.308)
In planning the budget of religious communities | 23.8/12.5 | 17.8/15.6 | 35.6/31.2 | 22.8/40.6 | 41.6/28.1 | 58.4/71.8 | p = 0.210 | p = 0.172 | p = 0.048 | ρ = 0.167 (p = 0.054)
In planning events of religious communities | 25.7/12.5 | 22.8/15.6 | 39.6/46.9 | 11.9/25 | 48.5/28.1 | 51.5/71.9 | p = 0.134 | p = 0.043 | p = 0.021 | ρ = 0.200 (p = 0.021)
In the field of religious community leadership | 57.4/34.4 | 27.7/31.2 | 11.9/28.1 | 3/6.2 | 85.1/65.6 | 14.9/34.3 | p = 0.062 | p = 0.015 | p = 0.009 | ρ = 0.227 (p = 0.009)
In creating text elements of religious media content (sermons, mission messages) | 54.5/37.5 | 18.8/25 | 22.8/15.6 | 4/21.9 | 73.3/62.5 | 26.8/37.5 | p = 0.008 | p = 0.244 | p = 0.046 | ρ = 0.192 (p = 0.027)
In creating visual and moving image elements of religious content | 41.6/21.9 | 23.8/18.8 | 30.7/40.6 | 4/18.8 | 65.4/40.7 | 34.7/59.4 | p = 0.014 | p = 0.013 | p = 0.005 | ρ = 0.255 (p = 0.003)
In creating religious podcasts | 43.6/25 | 27.7/15.6 | 22.8/37.5 | 5.9/21.9 | 71.3/40.6 | 28.7/59.4 | p = 0.008 | p = 0.002 | p = 0.003 | ρ = 0.273 (p = 0.001)
In creating and distributing personalized religious messages | 62.4/34.4 | 14.9/28.1 | 18.8/18.8 | 4/18.8 | 77.3/62.5 | 22.8/37.6 | p = 0.005 | p = 0.099 | p = 0.005 | ρ = 0.246 (p = 0.004)
In searching for religious information | 21.8/15.6 | 15.8/25 | 37.6/21.9 | 24.8/37.5 | 37.6/40.6 | 62.4/59.4 | p = 0.188 | p = 0.761 | p = 0.444 | ρ = 0.063 (p = 0.473)
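In Tables 5, 7 and 9, "after pooling" means the chi-square test is re-run on the two-category (accepting vs. not accepting) version of each item, which avoids sparse cells in the 2 × 4 cross-tabulation. A sketch of that step follows; the counts are illustrative assumptions, since the published tables report percentages only.

```python
# Chi-square "without pooling" (2 x 4 table) vs. "after pooling" (2 x 2 table).
# Counts are illustrative; the published tables report percentages only.
from scipy.stats import chi2_contingency

church = [24, 20, 37, 20]   # responses 1-4, church-defined group
self_  = [ 4,  4, 16,  8]   # responses 1-4, self-defined group

_, p_unpooled, _, _ = chi2_contingency([church, self_])

# Pool categories 1-2 ("not accepting") and 3-4 ("accepting").
pooled = [[church[0] + church[1], church[2] + church[3]],
          [self_[0] + self_[1], self_[2] + self_[3]]]
_, p_pooled, _, _ = chi2_contingency(pooled)
print(f"without pooling p={p_unpooled:.3f}, after pooling p={p_pooled:.3f}")
```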
Table 6. Do you consider the use of AI tools in the following religious functions to be authentic? Data in percentages.
Function | Not at all | Rather not | To a small extent | To a large extent | Not authentic | Authentic | Acceptable (Table 3) | Difference (Authentic–Acceptable)
In the form of a music playlist | 22.6 | 16.5 | 39.8 | 21.1 | 39.1 | 60.9 | 60.9 | 0
As a live translation | 21.1 | 15.8 | 38.3 | 24.8 | 36.8 | 63.2 | 65.4 | −2.2
As a spiritual leader | 77.4 | 12.8 | 7.5 | 2.3 | 90.2 | 9.8 | 12.8 | −3
In the field of religious education | 36.8 | 24.1 | 31.6 | 7.5 | 60.9 | 39.1 | 43.6 | −4.5
In the field of scientific research of a theological nature | 30.1 | 20.3 | 34.6 | 15 | 50.4 | 49.6 | 54.1 | −4.5
In the field of updating the email list of religious communities | 24.8 | 14.3 | 33.1 | 27.8 | 39.1 | 60.9 | 64.7 | −3.8
In the field of religious community newsletters | 27.1 | 19.5 | 36.1 | 17.3 | 46.6 | 53.4 | 57.9 | −4.5
In planning the budget of religious communities | 26.3 | 18 | 35.3 | 20.3 | 44.4 | 55.6 | 61.7 | −6.1
In planning events of religious communities | 27.8 | 21.1 | 38.3 | 12.8 | 48.9 | 51.1 | 56.4 | −5.3
In the field of religious community leadership | 58.6 | 26.3 | 12 | 3 | 85 | 15 | 19.6 | −4.6
In creating text elements of religious media content (sermons, mission messages) | 51.1 | 18.8 | 24.8 | 5.3 | 69.9 | 30.1 | 29.4 | 0.7
In creating visual and moving image elements of religious content | 36.8 | 24.1 | 28.6 | 10.5 | 60.9 | 39.1 | 40.6 | −1.5
In creating religious podcasts | 38.3 | 30.1 | 23.3 | 8.3 | 68.4 | 31.6 | 36.1 | −4.5
In creating and distributing personalized religious messages | 58.6 | 16.5 | 18.8 | 6 | 75.2 | 24.8 | 26.3 | −1.5
In searching for religious information | 24.8 | 18.8 | 35.3 | 21.1 | 43.6 | 56.4 | 61.6 | −5.2
Table 7. To what extent do those religious according to church teachings (Church) and those religious in their own way (Self) consider the use of AI tools in the following religious functions to be authentic? Data in percentages.
Function | Not at all (Church/Self) | Rather not (Church/Self) | To a small extent (Church/Self) | To a large extent (Church/Self) | Not authentic (Church/Self) | Authentic (Church/Self) | Chi-square (without pooling) | Chi-square (after pooling) | Mann–Whitney U | Spearman
In the form of a music playlist | 23.8/18.8 | 20.8/3.1 | 38.6/43.8 | 16.8/34.4 | 44.6/21.9 | 55.4/78.2 | p = 0.036 | p = 0.022 | p = 0.023 | ρ = 0.197 (p = 0.023)
As a live translation | 23.8/12.5 | 16.8/12.5 | 36.6/43.8 | 22.8/31.2 | 40.6/25 | 59.4/75 | p = 0.427 | p = 0.111 | p = 0.111 | ρ = 0.139 (p = 0.111)
As a spiritual leader | 84.2/56.2 | 9.9/21.9 | 5/15.6 | 1/6.2 | 94.1/78.1 | 5.9/21.8 | p = 0.008 | p = 0.008 (Fisher's exact: p = 0.015) | p = 0.001 | ρ = 0.293 (p = 0.001)
In the field of religious education | 41.6/21.9 | 25.7/18.8 | 26.7/46.9 | 5.9/12.5 | 67.3/40.6 | 32.7/59.4 | p = 0.056 | p = 0.007 | p = 0.008 | ρ = 0.229 (p = 0.008)
In the field of scientific research of a theological nature | 34.7/15.6 | 19.8/21.9 | 33.7/37.5 | 11.9/25 | 54.5/37.5 | 45.5/62.5 | p = 0.119 | p = 0.095 | p = 0.024 | ρ = 0.196 (p = 0.024)
In the field of updating the email list of religious communities | 28.7/12.5 | 12.9/18.8 | 33.7/31.2 | 24.8/37.5 | 41.6/31.2 | 58.4/68.8 | p = 0.201 | p = 0.296 | p = 0.093 | ρ = 0.146 (p = 0.024)
In the field of religious community newsletters | 29.7/18.8 | 19.8/18.8 | 36.6/34.4 | 13.9/28.1 | 49.5/37.5 | 50.5/62.5 | p = 0.263 | p = 0.236 | p = 0.085 | ρ = 0.150 (p = 0.085)
In planning the budget of religious communities | 29.7/15.6 | 20.8/9.4 | 33.7/40.6 | 15.8/34.4 | 50.5/25 | 49.5/75 | p = 0.045 | p = 0.011 | p = 0.008 | ρ = 0.232 (p = 0.007)
In planning events of religious communities | 30.7/18.8 | 23.8/12.5 | 37.6/40.6 | 7.9/28.1 | 54.5/31.2 | 45.5/68.8 | p = 0.015 | p = 0.022 | p = 0.008 | ρ = 0.232 (p = 0.007)
In the field of religious community leadership | 65.3/37.5 | 23.8/34.4 | 9.9/18.8 | 1/9.4 | 89.1/71.9 | 10.9/28.1 | p = 0.009 | p = 0.017 (Fisher's exact: p = 0.024) | p = 0.002 | ρ = 0.265 (p = 0.002)
In creating text elements of religious media content (sermons, mission messages) | 56.4/34.4 | 17.8/21.9 | 23.8/28.1 | 2/15.6 | 74.3/56.2 | 25.7/43.8 | p = 0.010 | p = 0.053 | p = 0.011 | ρ = 0.221 (p = 0.011)
In creating visual and moving image elements of religious content | 41.6/21.9 | 24.8/21.9 | 29.7/25 | 4/31.2 | 66.3/43.8 | 33.7/56.2 | p = 0.000 | p = 0.023 | p = 0.002 | ρ = 0.268 (p = 0.002)
In creating religious podcasts | 43.6/21.9 | 30.7/28.1 | 21.8/28.1 | 4/21.9 | 74.3/50 | 25.7/50 | p = 0.005 | p = 0.01 | p = 0.003 | ρ = 0.261 (p = 0.002)
In creating and distributing personalized religious messages | 64.4/40.6 | 13.9/25 | 18.8/18.8 | 3/15.6 | 78.2/65.6 | 21.8/34.4 | p = 0.014 | p = 0.151 | p = 0.015 | ρ = 0.212 (p = 0.014)
In searching for religious information | 27.7/15.6 | 17.8/21.9 | 37.6/28.1 | 16.8/34.4 | 45.5/37.5 | 54.5/62.5 | p = 0.1 | p = 0.424 | p = 0.084 | ρ = 0.150 (p = 0.084)
Table 8. Do you consider the use of AI tools in the following religious functions to be ethical? Data in percentages.
Function | Not at all | Rather not | To a small extent | To a large extent | Unethical | Ethical | Accepting (Table 3) | Difference (Ethical–Accepting)
In the form of a music playlist | 19.5 | 12.8 | 35.3 | 32.3 | 32.3 | 67.7 | 60.9 | 6.8
As a live translation | 18.8 | 14.3 | 33.1 | 33.8 | 33.1 | 66.9 | 65.4 | 1.5
As a spiritual leader | 72.9 | 13.5 | 8.3 | 5.3 | 86.4 | 13.6 | 12.8 | 0.8
In the field of religious education | 35.3 | 24.8 | 30.1 | 9.8 | 60.2 | 39.8 | 43.6 | −3.8
In the field of scientific research of a theological nature | 28.6 | 18.8 | 37.6 | 15 | 47.4 | 52.6 | 54.1 | −1.5
In the field of updating the email list of religious communities | 25.6 | 15.8 | 27.8 | 30.8 | 41.4 | 58.6 | 64.7 | −6.1
In the field of religious community newsletters | 24.1 | 23.3 | 33.8 | 18.8 | 47.4 | 52.6 | 57.9 | −5.3
In planning the budget of religious communities | 24.8 | 21.1 | 30.8 | 23.3 | 45.9 | 54.1 | 61.7 | −7.6
In planning events of religious communities | 23.3 | 25.6 | 33.8 | 17.3 | 48.9 | 51.1 | 56.4 | −5.3
In the field of religious community leadership | 57.9 | 24.1 | 12.8 | 5.3 | 82 | 18 | 19.6 | −1.6
In creating text elements of religious media content (sermons, mission messages) | 50.4 | 22.6 | 20.3 | 6.8 | 72.9 | 27.1 | 29.4 | −2.3
In creating visual and moving image elements of religious content | 39.1 | 22.6 | 28.6 | 9.8 | 61.7 | 38.3 | 40.6 | −2.3
In creating religious podcasts | 39.8 | 24.1 | 26.3 | 9.8 | 63.9 | 36.1 | 36.1 | 0
In creating and distributing personalized religious messages | 55.6 | 19.5 | 15 | 9.8 | 75.2 | 24.8 | 26.3 | −1.5
In searching for religious information | 25.6 | 18 | 36.1 | 20.3 | 43.6 | 56.4 | 61.6 | −5.2
Table 9. To what extent do those religious according to church teachings (Church) and those religious in their own way (Self) consider the use of AI tools in the following religious functions to be ethical? Data in percentages.
Function | Not at all (Church/Self) | Rather not (Church/Self) | To a small extent (Church/Self) | To a large extent (Church/Self) | Unethical (Church/Self) | Ethical (Church/Self) | Chi-square (without pooling) | Chi-square (after pooling) | Mann–Whitney U | Spearman
In the form of a music playlist | 21.8/12.5 | 13.9/9.4 | 36.6/31.2 | 27.7/46.9 | 35.6/21.9 | 64.4/78.1 | p = 0.220 | p = 0.147 | p = 0.044 | ρ = 0.175 (p = 0.044)
As a live translation | 21.8/9.4 | 15.8/9.4 | 29.7/43.8 | 32.7/37.5 | 37.6/18.8 | 62.4/81.2 | p = 0.229 | p = 0.048 | p = 0.148 | ρ = 0.126 (p = 0.149)
As a spiritual leader | 80.2/50 | 11.9/18.8 | 5/18.8 | 3/12.5 | 92.1/68.8 | 7.9/31.2 | p = 0.003 | p = 0.001 | p = 0.000 | ρ = 0.308 (p = 0.000)
In the field of religious education | 38.6/25 | 30.7/6.2 | 25.7/43.8 | 5/25 | 69.3/31.2 | 30.7/68.8 | p = 0.000 | p = 0.000 | p = 0.001 | ρ = 0.287 (p = 0.001)
In the field of scientific research of a theological nature | 32.7/15.6 | 21.8/9.4 | 35.6/43.8 | 9.9/31.2 | 54.5/25 | 45.5/75 | p = 0.007 | p = 0.004 | p = 0.001 | ρ = 0.278 (p = 0.001)
In the field of updating the email list of religious communities | 28.7/15.6 | 16.8/12.5 | 28.7/25 | 25.7/46.9 | 45.5/28.1 | 54.5/71.9 | p = 0.135 | p = 0.081 | p = 0.025 | ρ = 0.195 (p = 0.024)
In the field of religious community newsletters | 25.7/18.8 | 23.8/21.9 | 36.6/25 | 13.9/34.4 | 49.5/40.6 | 50.5/59.4 | p = 0.074 | p = 0.381 | p = 0.090 | ρ = 0.147 (p = 0.090)
In planning the budget of religious communities | 25.7/21.9 | 25.7/6.2 | 28.7/37.5 | 19.8/34.4 | 51.5/28.1 | 48.5/71.9 | p = 0.061 | p = 0.021 | p = 0.055 | ρ = 0.167 (p = 0.055)
In planning events of religious communities | 26.7/12.5 | 27.7/18.8 | 32.7/37.5 | 12.9/31.2 | 54.5/31.2 | 45.5/68.8 | p = 0.051 | p = 0.022 | p = 0.008 | ρ = 0.232 (p = 0.007)
In the field of religious community leadership | 65.3/34.4 | 21.8/31.2 | 10.9/18.8 | 2/15.6 | 87.1/65.6 | 12.9/34.4 | p = 0.002 | p = 0.006 | p = 0.001 | ρ = 0.298 (p = 0.000)
In creating text elements of religious media content (sermons, mission messages) | 54.5/37.5 | 23.8/18.8 | 19.8/21.9 | 2/21.9 | 78.2/56.2 | 21.8/43.8 | p = 0.001 | p = 0.015 | p = 0.012 | ρ = 0.218 (p = 0.012)
In creating visual and moving image elements of religious content | 43.6/25 | 25.7/12.5 | 26.7/34.4 | 4/28.1 | 69.3/37.5 | 30.7/62.5 | p = 0.000 | p = 0.001 | p = 0.001 | ρ = 0.288 (p = 0.001)
In creating religious podcasts | 43.6/28.1 | 26.7/15.6 | 24.8/31.2 | 5/25 | 70.3/43.8 | 29.7/56.2 | p = 0.004 | p = 0.006 | p = 0.006 | ρ = 0.237 (p = 0.006)
In creating and distributing personalized religious messages | 61.4/37.5 | 19.8/18.8 | 13.9/18.8 | 5/25 | 81.2/56.2 | 18.8/43.8 | p = 0.005 | p = 0.004 | p = 0.003 | ρ = 0.259 (p = 0.003)
In searching for religious information | 29.7/12.5 | 20.8/9.4 | 32.7/46.9 | 16.8/31.2 | 50.5/21.9 | 49.5/78.1 | p = 0.038 | p = 0.004 | p = 0.006 | ρ = 0.240 (p = 0.005)