The Hermeneutics of Artificial Text

Abstract: The spectacular achievements of so-called large language models (LLM), a technical solution that has emerged within natural language processing (NLP), are a common experience these days. This applies in particular to the artificial text generated in various ways by these models. Such text reaches a level of semantic quality comparable to, or even equal to, that of a human. On the other hand, there is an extensive and long-standing body of research on the role and meaning of the text in human culture and society, with a very rich philosophical background gathered in the field of hermeneutics. The paper justifies the necessity of using the research background of hermeneutics to study artificial texts and also proposes first conclusions about these texts in the context of this background. It thus formulates the foundations of a research area that can be called the hermeneutics of artificial text.


Introduction
The writing that is the basis of the text, Ong claims, developed in Mesopotamia around 3500 BC. Plato, through the mouth of Socrates in the dialogue Phaedrus, expresses many objections to writing. He claims that it creates the appearance of the existence of what it describes and, moreover, remains mute and immovable, depriving the spoken word of its mobility and interchangeability. It also weakens the mind, especially memory [1]. Ong points out that Plato makes these remarks by means of the very writing that has allowed them to survive for centuries, and regards him as one of the last proponents of oral culture [2]. However, writing, contrary to Plato, turned out to be one of the most important inventions of mankind. It has been able to bear the wealth of possibilities that language brings and, in addition, has entered into a deep relationship with it. At the level on which the argument is presented here, language and writing can be identified with each other, although their separateness has also been examined, e.g., [3].
Writing perceived from a technological perspective has been called text, because it materializes as text, although there may be different materializations. The transmission of successive incarnations allows, at the basic level at which we operate here, a smooth passage from language, through writing, to text, although each of these fields, especially the two extreme ones, has become the subject of extensive and separate research. As part of this transmission, one of the most important historical assumptions made about language in the context of the possibility of its technical, machine imitation should be recalled at the outset. It was set out by René Descartes in the treatise A Discourse on the Method from 1637. This treatise is important primarily because it became one of the foundations of the civilizational project of the Western world, the axis and central point of which is man. Machines, as Descartes wrote, "would never be able to use words or other signs by composing them as we do to declare our thoughts to others. For we can well conceive of a machine made in such a way that it emits words, and even utters them about bodily actions which bring about some corresponding change in its organs (...); but it is not conceivable that it should put these words in different orders to correspond to the meaning of things said in its presence, as even the most dull-witted of men can do" [4] (p. 46). Events related to the development of the technology called large language models (LLM), taking place in the field of natural language processing (NLP), belonging to the area of so-called artificial intelligence, have completely contradicted Descartes' strong position.
According to the definition given by Jurafsky and Martin, a language model is a model "that assign[s] probabilities to sequences of words" [5] (p. 30). A similar definition was provided by Goodfellow et al.: "A language model defines a probability distribution over sequences of tokens in a natural language" [6] (p. 46). A wide variety of approaches to language modeling, as Jurafsky and Martin write, appeared in the 1980s and 1990s, but their real flourishing has taken place in recent years in the form of the so-called neural network language models, which developed into the so-called large language models (LLM).
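The cited definitions can be illustrated with a minimal sketch: a toy bigram model that assigns a probability to a sequence of words by chaining conditional probabilities estimated from counts. The corpus and the word sequences are illustrative assumptions, not examples taken from the cited works.

```python
import math
from collections import Counter

# Toy corpus; real language models are trained on corpora of billions of tokens.
corpus = "the cat sat on the mat . the dog sat on the rug .".split()

# Count unigrams and bigrams to estimate P(w_i | w_{i-1}) = count(w_{i-1}, w_i) / count(w_{i-1}).
unigrams = Counter(corpus)
bigrams = Counter(zip(corpus, corpus[1:]))

def sequence_log_prob(words):
    """Log-probability of a word sequence under the toy bigram model."""
    logp = 0.0
    for prev, cur in zip(words, words[1:]):
        logp += math.log(bigrams[(prev, cur)] / unigrams[prev])
    return logp

# Sequences built from frequent transitions receive higher (less negative) scores.
print(sequence_log_prob("the cat sat".split()))
print(sequence_log_prob("sat on the".split()))
```

This is the statistical-language-model idea in its simplest form; the sequel of the historical development sketched below replaces such direct counting with learned neural representations.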
Zhao et al. present a comprehensive survey of LLMs [7], covering the years 2019-2023. It shows the rapid development of LLMs, whose growing complexity is associated with the enlargement of the corpora of texts on which these models are trained. The authors also present the historical development of language models, which consists of four phases. First, the so-called statistical language models (SLM) appeared, which used the idea of determining the probability of word distributions, but did so by direct calculations in the text and required too many dimensions. This defect was eliminated by neural language models (NLM) based on the distributional hypothesis, formulated in the 1950s mainly by Martin Joos [8], Zellig S. Harris [9], and John R. Firth [10]. Technically, this hypothesis was implemented by the solution finally proposed by Mikolov et al. [11]. It consisted of obtaining so-called embeddings, i.e., dense vectors, which are single semantic units and, at the same time, dimensions of an abstract semantic space. The embeddings technique allowed the barrier of semantic text parameterization to be crossed successfully and became the basis for all subsequent language models. Other significant innovations allowed for the construction of the so-called pre-trained language models (PLM), based on the idea of attention proposed by Vaswani et al. [12]. This idea became the basis for the widely used transformer technology, which increased the possibility of identifying and formalizing deeper semantic inferences in the text. This stage led to the creation of models with surprisingly spectacular semantic capabilities, such as generating intelligible text, i.e., text of high semantic complexity. The last stage was the so-called large language models (LLM), which differ from their predecessors in the scale of the solution, regarding both the number of parameters of the neural networks used and the volume of the corpus of texts on which they were trained.
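The attention mechanism mentioned above can be sketched, under strong simplifying assumptions (a single query, hand-picked two-dimensional vectors standing in for learned embeddings, and no learned projection matrices), as scaled dot-product attention:

```python
import math

def softmax(xs):
    """Numerically stable softmax turning raw scores into weights that sum to 1."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def attention(query, keys, values):
    """Scaled dot-product attention for a single query vector."""
    d = len(query)
    # Similarity of the query to each key, scaled by sqrt(dimension).
    scores = [sum(q * k for q, k in zip(query, key)) / math.sqrt(d) for key in keys]
    weights = softmax(scores)
    # Output is the similarity-weighted mixture of the value vectors.
    return [sum(w * v[i] for w, v in zip(weights, values)) for i in range(len(values[0]))]

# Toy 2-d "embeddings" for three tokens (illustrative values only).
keys = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
values = [[1.0, 0.0], [0.0, 1.0], [0.5, 0.5]]
query = [1.0, 0.0]  # most similar to the first key

out = attention(query, keys, values)
print(out)
```

The query attends most strongly to the keys it is most similar to, so the output mixes the value vectors in proportion to that similarity; transformers apply this operation in parallel across many heads and layers, which is what enables the deeper semantic inferences described above.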
Against the background of all language models, the solutions presented by OpenAI occupied a special position: the successive GPT (generative pre-trained transformer) models, which marked successive phases of the development of language models while remaining state-of-the-art solutions. At the same time, these models provided less and less technical information about themselves, moving from the open solution of GPT-2 [13] to the closed-source GPT-3 [14]. The latest model, GPT-4 [15], does not disclose even basic technical information, such as the number of parameters.
The GPT-4 model presents an extremely high level of semantic quality. This is confirmed both by the evaluation performed by the manufacturer [15] and by external research [16]. As its creators write: "We characterize GPT-4, a large multimodal model with human-level performance on certain difficult professional and academic benchmarks. GPT-4 outperforms existing large language models on a collection of NLP tasks and exceeds the vast majority of reported state-of-the-art systems" [15] (p. 14). External studies emphasize that "there is significant potential for ChatGPT and GPT-4 to be applied in a range of domains, including education, history, mathematics, physics, and more", and that "the potential of these models to revolutionize natural language processing is enormous" [16] (p. 27). Interacting with this model, also through its implementation in ChatGPT, reveals a spectacularly excellent level of generated texts, as well as the ability to understand and remember context or to give advice and instructions.

Results
The most important premise of the reasoning presented here is the observation that an artificial text is equivalent to a text whose author is a human being. This premise takes on special importance in the context of the role played by the text. The field of study that has developed an advanced reflection on this topic is hermeneutics, which was initially limited to the procedure of interpreting the text. However, as research progressed, it became clear that the text is a field on which fundamental issues of a cognitive and even existential nature play out. It also turned out to be an important component of social processes. In this situation, each instance of the text becomes an intervention in these areas. The reflection that has developed in this direction has led to the observation that the text is, in fact, a condition of these processes rather than their secondary component.
The question then arises of how an artificial text will exist in this context. The main guidelines in this regard are provided by Bleicher, who identifies three main strands of contemporary hermeneutics: hermeneutical theory, hermeneutic philosophy, and critical hermeneutics. These strands can also be understood as extended fields of the text's functioning and, thus, fields of its impact on reality. The first strand "focuses on the problematic of a general theory of interpretation as the methodology for the human sciences (or Geisteswissenschaften, which include the social sciences)" [17] (p. 1). It thus covers the creation, circulation, and functioning of meaning in the communicative perspective. This perspective inevitably expands to the level of the issue of knowledge. The text "can consequently no longer be the objective re-cognition of the author's intended meaning, but the emergence of practically relevant knowledge in which the subject himself is changed by being made aware of new possibilities of existence and his responsibility for his own future" [17] (p. 3). The text thus becomes an autonomous agent of a cognitive nature, raising epistemological issues anew, which results from the nature of the process in which knowledge (meaning) is produced "through the dialogical dialectical mediation of subject and object" [17] (p. 3). The third strand is exemplified by Jürgen Habermas's contribution, which "challenges the idealist assumptions underlying both hermeneutical theory and hermeneutic philosophy: the neglect to consider extra-linguistic factors which also help to constitute the context of thought and action, i.e., work and domination" [17] (p. 3). This last strand foregrounds the social context, showing the impact of the text on even the fundamental phenomena of power and domination.
The indicated areas of activity of hermeneutical analysis show the possibilities inherent in the natural text, which can be appropriately applied to the artificial text. In this situation, it is necessary to identify the new properties of the latter, unknown from the experience of dealing with the natural text, with a potentially profound impact on both epistemic and social processes. In this context, we can try to collect the changing circumstances of the functioning of language and text and present them in the form of a hypothesis. They are as follows:
1. Artificial text eliminates symmetrical and equal dialogicality. The circumstances of the creation of an artificial and a natural text are completely different; this applies to the process of shaping the argument, its meaning, the choice of means of expression, etc.;
2. The artificial text changes the communication situation, hitherto based on a dialogic scheme. The communication situation also ceases to be symmetrical;
3. The artificial text undermines the existing scheme of representing the world in language/text. An artificial text is created as a result of creative processes completely different from natural ones, in every aspect, i.e., technical, rhetorical, etc.;
4. An artificial text changes or even excludes the institution of context in many of its aspects, including political and social ones. Natural texts always function in the context of other texts; this continuity is interrupted in the case of an artificial text;
5. An artificial text undermines or even destroys the poetic function of the text, for example, by limiting the development of semantics beyond the usual patterns and the search for new values of sentences and words in this respect. Natural text can be, and usually is, a contribution to the creative shaping of language elements.

Discussion
The phenomenon of the text is the subject of a very advanced, long-term, and extremely rich reflection, which constitutes the area of research called hermeneutics. These studies justify the unique and autonomous character of each meaningful text, regardless of its origin, which is important from the point of view of artificial texts. A particular development of hermeneutics occurred in the twentieth century. However, the sources of this reflection go back to philosophers such as Protagoras, Plato, Aristotle, and the Stoics. The modern development of hermeneutics began with the Protestant Reformation, which, as Forster and Gjesdal claim, transferred the activity of interpreting the Bible from the institution of the Church to individual believers and thus democratized and disseminated this skill. It was the 18th-century Protestant theorists who transferred it to a completely secular social context [18] (p. 2). In the nineteenth century, Johann Gottfried Herder, Friedrich Schleiermacher, as well as Wilhelm Dilthey and Friedrich Nietzsche entered this field, significantly expanding the understanding of this context and opening the field for far more advanced hermeneutical reflection, extending into every field subjected to the act of understanding [17] (p. 1).
An important culmination of the issues of hermeneutics is brought by Hans-Georg Gadamer's book from 1960 [19]. Gadamer understands the act of understanding as an act of establishing reality and assigns the main role in this process to language and its direct manifestation, which is the text. This is summed up in his famous phrase: "Being that can be understood is language" [20] (p. 470). Embracing the world through the act of understanding it, expressed in language, can also be interpreted as the production of knowledge about this world. On the other hand, hermeneutical reflection goes beyond the area of epistemological issues. As Vattimo writes: "the question concerning a rationally grounded understanding of texts has progressively tended towards the thinking of a general ontology" [21] (p. 721).
Gadamer's reflection is the culmination of a certain stage in the development of hermeneutics and, at the same time, an impulse for a great wave of hermeneutic reflection in the twentieth century, in which outstanding thinkers appear, representing continental Europe, such as Jacques Derrida, Roland Barthes, Michel Foucault, and Gilles Deleuze; English analytical philosophy, e.g., Bertrand Russell, Gilbert Ryle, and John Langshaw Austin; or America, such as Richard Rorty and Paul de Man. It also refers to research undertaken within the so-called Vienna Circle, with its most famous associated figure, Ludwig Wittgenstein.

Conclusions
Large language models (LLM) show a growing trend, both in terms of the number of proposed solutions and in terms of size and complexity.
The most advanced solutions in the field of LLMs, including GPT-4 by OpenAI, show spectacular possibilities for generating artificial text: providing answers, formulating statements of considerable length, etc. The texts that carry these products are intelligible, i.e., they have a degree of semantic complexity comparable to, or even equal to, that of human texts, making them virtually indistinguishable from human texts.
On the other hand, there is a very extensive and old scholarly reflection on the text, contained in the field of hermeneutics, which documents and analyzes the status and role of the text. According to its conclusions, including in particular those contained in the most influential work in this area, i.e., Hans-Georg Gadamer's book Truth and Method (1960), the text is fundamental to the perception and understanding of the presence of man in the world. In particular, this reflection concerns both the metaphysical (epistemological and ontological) foundations of this presence and the broad social context. It should, therefore, be expected that the emergence of the artificial text, i.e., a text created by processes significantly different from those that can be called natural, will affect the circumstances of the text's existence indicated in hermeneutics. It can, therefore, be assumed that artificial texts will lead both to the reconstruction of philosophical reflection devoted to man and to social changes of an empirical nature.
This paper proposes the first, basic observations regarding these expectations, which can be considered the inauguration of the research area under the name of the hermeneutics of artificial text.