Towards a Truly Pragmatic Concept of Knowledges

Abstract: The article describes the evolution of the understanding and application of the concept of knowledge in the 20th century. It covers key developments, such as the epistemological turn in mathematics, Popper's proposal, Gettier's article, the emergence of the idea of social contexts of knowledge, the instrumentalization of knowledge, e.g., in management, and a new knowledge situation in the context of artificial intelligence. These developments lead to the idea of knowledges, which replaces the traditional, Platonic idea of one true knowledge that belongs exclusively to man.


Introduction
This text traces the evolution of the contemporary understanding of knowledge. Understanding knowledge is itself knowledge about knowledge; hence changes in this area affect very basic human epistemic predispositions. The evolution is a transition from knowledge as a state unique to man to a state that can be called the dispersed disposition of knowledge. At the same time, one of the most important properties of knowledge, its truth, is being devalued. Knowledge is initially understood here as propositional knowledge (subject to articulation), and thus it is treated as a specific social phenomenon and, by necessity, also a historical one. This approach to knowledge developed in the twentieth century, especially in its second half, when the concept of knowledge became the subject of a very rich analytical effort, described, for example, by Burgin [1].
Knowledge has been a subject of philosophy since the beginning of European culture, although a separate branch of philosophy called epistemology, understood as a self-conscious research procedure, is much younger. The concept of epistemology did not appear until the 18th century. The most important questions that arise within it are given by Wolenski: "What is knowledge?; Is knowledge based on senses or reason? Is certainty attainable? What is truth? Are there ultimate limits of knowledge?" [2] (p. 4). Chisholm asks analogous questions somewhat differently: "What can I know? How can I distinguish those things I am justified in believing from those things I am not justified in believing? And how can I decide whether I am more justified in believing one thing than in believing another?" [3] (p. 1). These questions not only rest on the key role of man; they are even questions "about ourselves".
It should be emphasized that epistemology is capable of treating knowledge as an independent subject, not dispersed among various research contexts. This approach was initiated by the fundamental understanding of knowledge provided by Plato. The theme of knowledge appears in several of his writings, such as Phaedo, Symposium, Republic, Timaeus, Sophist, and Statesman [4]. The most famous and still important definition of knowledge appears in Theaetetus: dóksa alethés metá lógu. The English version that dominates the literature is as follows: "true belief accompanied by a rational account" [5] (p. 115). It is usually abbreviated to the three letters JTB, which stand for Justified True Belief. That interpretation "is a central philosophical claim [of knowledge] of the Western tradition since Plato" [6] (p. 43). For Plato, knowledge was an exclusively human disposition, and its status remained unchanged for over two thousand years thereafter. This assumption results from the use of the notion of dóksa. Knowledge according to Plato is also necessarily true (alethés). Both of these features of knowledge, however, were challenged in the reconstruction process that took place in the 20th century.

Results
The 20th century was the period in which the understanding of knowledge became dispersed. The term came to be used widely in hitherto unprecedented contexts such as politics, social life, and management. This dispersion, however, was the result of a deeper process concerning the key questions about knowledge. That process concerned man's ability to access the world, and therefore his most basic epistemic competencies. Reflection on this problem also goes back to antiquity, but its well-known rationalist interpretation was given by Descartes, who formulated the dualism of matter (res extensa) and mind (res cogitans) [7]. This dramatic contrast became a model for understanding the mutual position of man and the world over the next 300 years, also building a context for understanding knowledge. Knowledge, as a property of the mind, presupposes the existence of matter as an objective entity that can be known by reason.
This paper tries to show that such an image of knowledge has been replaced by the image of multiple, distributed knowledges existing in various variants. This is the result of a process in which knowledge, on the one hand, lost its anchor in the form of a human being as its sole disposer, gradually becoming an autonomous phenomenon. On the other hand, due to the breakdown of faith in the simple possibility of representing the world, such knowledge was deprived of the requirement of truthfulness. The result is the multiplication of many contingent knowledges in place of one true knowledge.
This transformation is undoubtedly the result of the digital transformation, understood not as the mere domination and influence of digital technologies, but as a change of cognitive paradigm that took place at the end of the 19th century in mathematics and geometry. At the same time, digital technologies, especially the most advanced, such as so-called artificial intelligence, and within it NLP, require a concept of knowledge detached from the human, instrumentalized, and adapted to the purposeful and utilitarian requirements of computing systems performing clearly defined tasks, e.g., translation, question answering, reading comprehension, and cloze completion.

Discussion
The change described here was a long process, lasting from the end of the 19th century to the end of the 20th century, and its effects have intensified rather than faded away. To illustrate the phases of this change, the following developments can be indicated, each briefly discussed below: (1) the aforementioned cognitive paradigm shift taking place in the second half of the 19th century and at the beginning of the 20th century; (2) Popper's concept, which practically excludes the possibility of expressing constructive and certain judgments about the world; (3) the so-called Gettier problem, showing an error in Plato's definition of knowledge [8]; (4) a great project to relativize human cognition and knowledge with respect to the social context, developed in many variants in the 20th century; (5) the pragmatic understanding of knowledge as a resource that can shape large-scale social phenomena (e.g., the knowledge society), perform the function of an organizational resource in economic processes (e.g., knowledge management), or generate an environment for human functioning (e.g., knowledge organization); and (6) the emergence of knowledge as a resource underpinning artificial cognitive systems of varying degrees of sophistication, including the latest language models in the field of NLP. Each of these developments has far-reaching consequences, briefly recalled below.
The change of the cognitive paradigm in the 19th century led to the emergence of the concept of building formal models as completely arbitrary structures that do not need to represent the world, but only to maintain internal coherence based on assumed rules of inference.
Kline describes this state as follows: "The two-thousand-year-old conviction that mathematics was the truth about nature was shattered. But the mathematical theories now recognized to be arbitrary had nevertheless proved useful in the study of nature. (...) Mathematicians then should feel free to create arbitrary structures" [9] (p. 1036). In practice, these appear as the axiomatic systems described by Peano [10] and Hilbert [11]. They are the result of the so-called non-Euclidean geometries, i.e., geometries detached from experience and based on free, though disciplined and coherent, speculation. Such a strategy not only does not prevent their application to the physical world but allows them to overcome the disadvantages of human observation resulting from imperfect tools of cognition. This reversed direction of building models of the world causes a deep crisis of trust in various forms of perceiving the world. New models emerge, such as the network model, which is the universal prototype of complexity, and systems theory. This upheaval also appears in the field of the humanities, resulting in so-called poststructuralism and postmodernism.
The concept of falsificationism presented by Karl Popper in 1935 [12] can be considered a consequence of the aforementioned crisis of trust. According to Popper, a scientific discovery is unconstrained in the process of its creation. It is the result of freedom of interpretation, which must be subject to strict rules of logic only in further proceedings. Popper answered the question about the source of a theory directly: "what you will" [13] (p. 9). He also pointed out that no theory is certain, since it can be refuted by subsequent discoveries. Certainty is provided only by the non-constructive procedure of falsification, i.e., proving falsehood. This deep cognitive pessimism, combined with the removal of the absolute requirement to represent the world from the beginning of the research procedure, including the experimental one, raises doubts about the credibility of scientific knowledge, returning us to the basic epistemological questions mentioned at the beginning.
Gettier played the role of the child who cried out that the emperor had no clothes, causing, with this rather simple gesture, a deep crisis in the prevailing order, this time related to Plato's definition of knowledge, until then considered accurate and valid. Gettier's very short text consists mainly of two examples proving that one can have justified and true beliefs, and thus fulfill Plato's conditions, and yet not have knowledge. These examples were supported by Sober, as well as by Russell and Meinong, who had previously pointed to similar situations. Gettier's article provoked a heated discussion and many proposals to correct Plato's definition [14]. However, it was also a source of deep skepticism and a significant reformulation of the human situation in the world.
Gettier's approach, summarized by Sober [15], also introduced the process of reconstructing the understanding of knowledge as a social phenomenon, one that increasingly emphasizes the connection of man with the social and historical circumstances of life, which shape his basic competencies, including cognitive ones. The observations and theories that appeared throughout the twentieth century ultimately formed a great project, analyzing in detail both these circumstances and competencies. Adolf and Stehr mention such great classics as Max Weber, Max Scheler, Karl Marx, Karl Mannheim, Georg Simmel, and Émile Durkheim as researchers who opened this perspective [16]. Adolf and Stehr consider Karl Mannheim, who associated knowledge with immediate historical conditions, to be the founder of the sociology of knowledge [17]. A developed and self-aware sociology of knowledge emerged thanks to David Bloor, the creator of the so-called strong programme, an idea linking knowledge with the social context. In 1976, Bloor wrote, clearly arguing with Plato: "Instead of defining it [knowledge] as true belief, or perhaps, justified true belief, knowledge for the sociologist is whatever people take to be knowledge. It consists of those beliefs which people confidently hold to and live by" [18] (p. 5).
Thinking about knowledge also necessarily refers to the practical circumstances of its designed emergence: the laboratory. Researchers such as Robert Merton, Steve Woolgar and Bruno Latour, or Karin Knorr-Cetina analyzed it by polemicizing with the naive notion of "objective" knowledge. Latour extended these reflections into a powerful theory in which he transcended the boundaries of traditional sociology, reformulating the abovementioned Cartesian dichotomy on new principles. Thus, he built a new ontology in which nature is equated with culture in processes of mutual, active interaction on both sides. The complex web of these interactions is what Latour called the social, and it is the place where determined and local knowledge is always produced [19].
Such an approach was initiated by the concept of Ludwik Fleck [20], whom Latour considered the father of the sociology of knowledge. Carried to its conclusion, this approach also entails self-referential and, consequently, meta-level reflection, which directly implements the assumptions of the cognitive upheaval in mathematics. Fleck also inspired another branch of the interpretation of the social context of science, represented by Thomas Kuhn, Imre Lakatos, and Paul Feyerabend. They rejected, in various ways, the cumulative development of knowledge based on the cause-effect order, pointing to its discontinuous or anarchic character.
By necessity, social issues also include political phenomena, that is, those concerning the distribution of power, political assumptions, and the organization of societies. This line of thought came relatively early and appeared as the project of the so-called knowledge society, in which knowledge is a source of competitive advantage and a fundamental good. Thus, knowledge became reified and instrumentalized. The reflection of such researchers as Peter Drucker, Fritz Machlup, Herbert Simon, or Daniel Bell combined philosophical, historical, social, and economic considerations.
This kind of approach was also represented by French poststructuralists, such as Jean-François Lyotard and Michel Foucault, who interpreted the social nature of knowledge through language, which they considered a key carrier of knowledge. The association of knowledge with language is, of course, much older, but in the 20th century it produced, mainly owing to Ludwig Wittgenstein, English analytical philosophy, and hermeneutics, a specific and very strong trend in the interpretation of knowledge. Its characteristic feature is the dependence of knowledge on language games, which are local in the temporal and spatial sense and thus lead to the dissociation and relativization of the phenomenon of knowledge.
In the 1990s, the pragmatic approach to knowledge took on a direct character, giving Bell's idea an objective implementation in the form of so-called knowledge management (KM). Knowledge became an organizational resource, similar to other resources; it became an object of exploitation and a component of production processes. Dalkir describes this approach: "knowledge management represents a deliberate and systematic approach to ensure the full utilization of the organization's knowledge base, coupled with the potential of individual skills, competencies, thoughts, innovations, and ideas to create a more efficient and effective organization" [21] (p. 2).
However, perhaps the most extreme pragmatic approach was implemented in the understanding of knowledge that appeared in the field of computer science and its specific technologies, above all in NLP. Knowledge is the subject of formalization, in which various logical and mathematical tools are used to represent knowledge so that it can be used in computational processes, e.g., [22,23]. A separate approach to knowledge appears in particular in the area of so-called artificial intelligence, e.g., [24,25], and within it in NLP technology, e.g., [26]. The latter field, especially, provides spectacular examples of the manifestation of knowledge in the GPT-2 and GPT-3 models [27,28]. In this case, knowledge has a distributed character and is implicit and hidden, i.e., non-symbolic. As a result of the development of digital technologies, knowledge also appears through access to massive collections of data and texts, e.g., the web, where it is explored using techniques called mining [29,30].
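The contrast drawn above, between explicit, symbolic knowledge representation and the distributed, non-symbolic knowledge of language models, can be illustrated with a minimal sketch. The facts and names below are purely illustrative; the triple-store pattern shown here is one common symbolic technique, in which each fact is an inspectable (subject, predicate, object) statement that can be queried directly, whereas the "knowledge" of a neural model is spread across learned weights and has no such discrete, readable form.

```python
# A tiny symbolic knowledge base: each fact is an explicit, inspectable
# (subject, predicate, object) triple. Illustrative facts only.
KB = {
    ("Plato", "authored", "Theaetetus"),
    ("Theaetetus", "defines", "knowledge"),
    ("knowledge", "requires", "justification"),
    ("knowledge", "requires", "truth"),
    ("knowledge", "requires", "belief"),
}

def query(subject=None, predicate=None, obj=None):
    """Return all triples matching the given (possibly partial) pattern;
    None acts as a wildcard for that position."""
    return [
        (s, p, o)
        for (s, p, o) in KB
        if (subject is None or s == subject)
        and (predicate is None or p == predicate)
        and (obj is None or o == obj)
    ]

# What does knowledge require? Recovers the three JTB conditions explicitly.
print(sorted(o for _, _, o in query("knowledge", "requires")))
# → ['belief', 'justification', 'truth']
```

The point of the sketch is that every item of "knowledge" here exists as a discrete, symbolic statement that can be listed, audited, and corrected; in a model such as GPT-2 or GPT-3, no individual weight corresponds to any such statement.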

Conclusions
The phenomenon of knowledges (plural) can be observed as a social and technological phenomenon of the formation of local knowledge centers, which was predicted by the trend based on the social construction of knowledge but fully emerged only as an effect of digital development.