Natural Morphological Computation as Foundation of Learning to Learn in Humans, Other Living Organisms, and Intelligent Machines

Abstract: The emerging contemporary natural philosophy provides a common ground for an integrative view of natural, artificial, and human-social knowledge and practices. The learning process is central for acquiring, maintaining, and managing knowledge, both theoretical and practical. This paper explores the relationships between present advances in the understanding of learning in the sciences of the artificial (deep learning, robotics), the natural sciences (neuroscience, cognitive science, biology), and philosophy (philosophy of computing, philosophy of mind, natural philosophy). The question is what, at this stage of development, inspiration from nature, specifically its computational models such as info-computation through morphological computing, can contribute to machine learning and artificial intelligence, and how much, on the other hand, models and experiments in machine learning and robotics can motivate, justify, and inform research in computational cognitive science, neuroscience, and computing nature. We propose that one contribution can be an understanding of the mechanisms of ‘learning to learn’, as a step towards deep learning with a symbolic layer of computation/information processing in a framework linking connectionism with symbolism. As all natural systems possessing intelligence are cognitive systems, we describe the evolutionary arguments for the necessity of learning to learn for a system to reach human-level intelligence through evolution and development. The paper thus presents a contribution to the epistemology of the contemporary philosophy of nature.


Introduction
Artificial intelligence in the form of machine learning is currently making impressive progress, especially in the field of deep learning (DL) [1]. Algorithms in deep learning have been inspired by the human brain, even though our knowledge about brain functions is still incomplete, yet steadily increasing. Learning here is a two-way process where computing is learning from neuroscience, while neuroscience is adopting information processing models, and this process iterates, as discussed in [2][3][4].
Deep learning is based on artificial neural networks resembling the neural networks of the brain, processing huge amounts of (labelled) data on high-performance GPUs (graphics processing units) with a parallel architecture. It is (typically supervised) machine learning from examples. It is static, based on the assumption that the world continues to behave in a similar way and that the domain of application is close to the domain from which the training data were obtained. However impressive and successful, deep learning in its present form remains constrained by these assumptions.

Learning about the World through Agency
The framework for the discussion is the computing nature in the form of info-computationalism. It takes the world (Umwelt) for an agent to be information [15] with its dynamics seen as computation [16]. Information is observer relative and so is computation [17][18][19].
When discussing cognition as an embodied bioinformatic process, we use the notion of agent, i.e., a system able to act on its own behalf, pursuing an intrinsic goal [17,20]. Agency in biological systems, in the sense used here, has been explored in [21,22], where arguments are provided that the world as it appears to an agent depends on the type of agent and the type of interaction through which the agent acquires information about the world [17]. Agents communicate by exchanging messages (information), which helps them coordinate their actions based on the information they possess and subsequently share through social cognition.
For something to be information, there must exist an agent for whom that is established as a "difference that makes a difference" [23]. When we argue that the fabric of the world is informational [24], the question can be asked: Who/what is the agent in that context? An agent can be seen as interacting with the points of inhomogeneity (differences that make a difference/data as atoms of information), establishing the connections between those data and the data that constitute the agent itself (a particle, a system). There are myriads of agents for which information of the world makes differences, from elementary particles to molecules, cells, organisms, and societies; all of them interact and exchange information at different levels of scale, and these information dynamics are natural computation [25,26].
Our definition of agency and cognition as a property of all living organisms builds on Maturana and Varela [27,28], and Stewart [29]. The question relevant for AI is how artifactual agents should be built in order to possess cognition and eventually even consciousness. Is it possible at all, given that cognition in living organisms is a deeply biologically rooted process? Along with reasoning, language is considered a high-level cognitive activity of which only humans are capable. Increasing levels of cognition evolved in living organisms, starting from basic automatic behaviors such as those found in organisms from bacteria [30][31][32][33] to insects, up to the increasingly complex behavior of complex multicellular life forms such as mammals [34]. Can AI "jump over" evolutionary steps in the development of cognition to reach, and even exceed, human intelligence?
While the idea that cognition is a biological process in all living organisms has been extensively discussed [27][28][29], it is not clear on which basis cognitive processes in all kinds of organisms would be accompanied by (some kind of, some degree of) consciousness. Consciousness is, according to Bengio [7], characteristic of System 2: "We closely associate conscious processing to Kahneman's system 2 cognitive abilities [Kahneman, 2011]." Bengio adopts Baars' global workspace theory of consciousness [35]. In the process of learning, and learning to learn, consciousness plays an important role through the process of attention, which selects only a tiny subset of information/data for processing, instead of indiscriminately processing huge amounts of data, which is expensive in terms of response time and energy [7].
If we, in parallel with "minimal cognition" [36], search for "minimal consciousness" in an organism, what would that be? Opinions are divided on the point in evolution at which consciousness can be said to have emerged. Some, such as Liljenström and Århem, would suggest that only humans possess consciousness, while others are ready to recognize consciousness in animals with emotions [37,38]. From the info-computational point of view, it has been argued that cognitive agents with nervous systems are the step in evolution which first enabled consciousness, in the sense of an internal model with the ability to distinguish the "self" from the "other" and to provide a representation of "reality" for an agent based on that distinction [4,39].

Learning in the Computing Nature
For naturalists, nature is the only reality [40]. Nature is described through its structures, processes, and relationships, using a scientific approach [41,42]. Naturalism studies the evolution of the entire natural world, including the life and development of humans and humanity as a part of nature. Social and cultural phenomena are studied through their physical manifestations. An example of a contemporary naturalist approach is the research field of social cognition, with its network-based studies of social behaviors. Turing already emphasized the social character of learning [43], a point later elaborated on by Minsky [44] and Dennett [45].
Computational naturalism (pancomputationalism, naturalist computationalism, computing nature) [46][47][48], see also [3,4], is the view that the entirety of nature is a huge network of computational processes, which, according to physical laws, computes (dynamically develops) its own next state from the current one. Among prominent representatives of this approach are Zuse, Fredkin, Wolfram, Chaitin, and Lloyd, who proposed different varieties of computational naturalism. According to the idea of computing nature, one can view the time development (dynamics) of physical states as information processing (natural computation). Such processes include self-assembly, self-organization, developmental processes, gene regulation networks, gene assembly, protein-protein interaction networks, biological transport networks, social computing, evolution, and similar processes of morphogenesis (creation of form). The idea of computing nature and the relationships between the two basic concepts of information and computation are explored in [17][18][19][25].
In the computing nature, cognition is a natural process, seen as a result of natural bio-chemical processes. All living organisms possess some degree of cognition, and for the simplest ones, like bacteria, cognition consists in metabolism and (my addition) locomotion [17]. This "degree" is not meant as a continuous function, but as a qualitative characterization that cognitive capacities increase from the simplest to the most complex organisms. The process of interaction with the environment causes changes in the informational structures that correspond to the body of an agent and its control mechanisms, which define its future interactions with the world and its inner information processing [49]. Informational structures of an agent become semantic information (i.e., acquire explicit metacognitive meaning through System 2, which generates metacognition for an agent) only in the case of highly intelligent agents capable of reasoning, which we now know includes some birds.
Recently, empirical studies have revealed an unexpected richness of cognitive behaviors (perception, information processing, memory, decision making) in organisms as simple as bacteria [30][31][32][33]. Single bacteria are small, typically 0.5-5.0 micrometers in length, and interact with only their immediate environment. The lifetime of an individual bacterium is too short for it to memorize a significant amount of data. Biologically, bacteria are immortal at the level of the colony, as the two daughter bacteria resulting from the cell division of a parent bacterium are considered two new individuals. Thus bacterial colonies, swarms, and films, which extend over a larger space, can survive a longer time, and have longer memory, exhibit an unanticipated complexity of behaviors that can undoubtedly be characterized as cognition [50,51], see also [45]. Even more fascinating cases are simpler agents like viruses, on the border of the living, which are based on the simple principle that the most viable versions persist and multiply while others vanish [52,53]. Memory and learning are the key competences of living organisms [50], and in the simplest case, memory is based on the change of shape [54], which appears at different scales and levels of organization [55]. Fields and Levin add an evolutionary perspective to the characterization of memory and argue that the "genome is only one of several multi-generational biological memories". Additionally, the cytoplasm and cell membrane, which characterize all of life on an evolutionary timescale, preserve memory [56]. Because of the complex structure of the cell, biological memory cannot be understood at one particular scale; information is propagated and preserved in non-genomic cellular structures, which changes the current understanding of biological memory [55,56]. It also forms at different time scales [57].
Starting with bacteria and archaea [58], all organisms without nervous systems cognize, that is, perceive their environment, process information, learn, memorize, and communicate. As they are natural information processors, some, such as the slime mold (multinucleate or multicellular Amoebozoa), have been used as natural computers/information processors to compute shortest paths. Even plants cognize, in spite of often being thought of as living systems without cognitive capacities [59]. Plants have been found to possess memory (in their bodily structures that change as a result of past events), the ability to learn (plasticity, the ability to adapt through morphodynamics), and the capacity to anticipate and direct their behavior accordingly. Plants are also argued to possess rudimentary forms of knowledge, according to [60] (p. 121), [61] (p. 7), and [34] (p. 61).
Consequently, in this article we take basic cognition to be the totality of processes of self-generation/self-organization, self-regulation, and self-maintenance that enables organisms to survive by processing information from the environment. The understanding of cognition as it appears in degrees of complexity in living nature can help us better understand the step between inanimate and animate matter, from the first autocatalytic chemical reactions to the first autopoietic proto-cells, as well as the evolution of life and learning.

Learning in the Evolutionary Perspective
A recent trend in design is learning from nature: biomimetics. Deep learning is one of the technologies developed within the biomimetic paradigm. In the case of intelligence, we still have a lot to learn from nature about how our own brains, intelligence, and learning function. One of the strategies is to start with learning in the simplest organisms, in order to uncover the basic mechanisms of the process. Evolution can be seen as a process of problem-solving [34]. "From the amoeba to Einstein, the growth of knowledge is always the same: We try to solve our problems, and to obtain, by a process of elimination, something approaching adequacy in our tentative solutions" [62] (p. 261). All acquired knowledge, whether acquired in the process of genetic evolution or in the process of individual learning, consists (this is Popper's central claim) in the modification "of some form of knowledge, or disposition, which was there previously, and in the last instance of inborn expectations" [62] (p. 71).
Popper's theory of the growth of knowledge through trial-and-error conjecture-based problem-solving shares its basic approach with evolutionary epistemology. According to Campbell [63], all knowledge processes can be seen as the "variation and selective retention process of evolutionary adaptation" [64]. Thagard [65] criticizes Popper, Campbell, Toulmin, and others who proposed Darwinian models of the growth of (scientific) knowledge. Evolutionary epistemology emphasizes the analogy between the development of biological species and scientific knowledge, based on variation, selection, and transmission. Thagard, on the other hand, holds that the differences are more important than the similarities, and that scientific knowledge is guided by "intentional, abductive theory construction in scientific discovery, the selection of theories according to general criteria, the achievement of progress by sustained application of criteria, and the transmission of selected theories in highly organized scientific communities". Even though scientific knowledge is a specific, formal kind of knowledge, it is nevertheless knowledge.
This criticism of evolutionary epistemology addresses a specific understanding of evolution, Darwinism in the narrow sense. However, the contemporary extended evolutionary synthesis provides mechanisms beyond the blind variation of narrow Darwinism, and can accommodate learning, anticipation, and intentionality [66][67][68][69]. In a similar, broader evolutionary approach, Watson and Szathmáry ask "Can evolution learn?" in [70], and suggest that "evolution can learn in more sophisticated ways than previously realized". Here, "a system exhibits learning if its performance at some task improves with experience". They propose new theoretical approaches to the evolution of evolvability and the evolution of ecological organizations, among others. They refer to Turing, who made an algorithmic model of computation (the Turing machine) and established the connection between learning and intelligence through an algorithmic approach [71]. The relationship between learning and evolution is established through the notion of reinforcement learning, as "reusing behaviors that have been successful in the past (reinforcement learning) is intuitively similar to the way selection increases the proportion of fit phenotypes in a population". Watson and Szathmáry's paper lists a number of different types of learning, including diverse machine learning approaches, and ends with the claim that there is a clear analogy between evolution and the process of learning, and that we can better understand evolution if we see it as learning.
In spite of mentioning Turing's pioneering work on the topic of algorithmic learning, Watson and Szathmáry assume "incremental adaptation (e.g., from positive and/or negative reinforcement)".
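The claimed parallel between selection and reinforcement can be illustrated with a toy simulation (a sketch only; the phenotypes, fitness values, and update rule are illustrative, not taken from Watson and Szathmáry): a learner that reuses previously rewarded behaviors ends up favoring the same variant that fitness-proportional selection fixes in a population.

```python
import random

random.seed(0)

# Two behaviors/phenotypes with different (hypothetical) success rates.
FITNESS = {"A": 0.8, "B": 0.3}

# Reinforcement-learning view: reuse behaviors that paid off in the past.
propensity = {"A": 1.0, "B": 1.0}
for _ in range(2000):
    action = random.choices(list(propensity), weights=list(propensity.values()))[0]
    reward = 1.0 if random.random() < FITNESS[action] else 0.0
    propensity[action] += reward   # successful behavior is reinforced

# Evolutionary view: selection shifts the frequency of fit phenotypes.
population = ["A"] * 50 + ["B"] * 50
for _ in range(50):
    population = random.choices(       # reproduction proportional to fitness
        population, weights=[FITNESS[p] for p in population], k=len(population)
    )

print(propensity["A"] > propensity["B"])              # learner favors "A"
print(population.count("A") > population.count("B"))  # so does selection
```

Both processes converge on the fitter variant, which is the intuition behind reading selection as a simple form of reinforcement learning.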
Critics of the evolutionary approach argue for the impossibility of such an incremental process producing highly complex structures such as intelligent living organisms. Monkeys typing Shakespeare are often used as an illustration. As a counterargument, Chaitin [72] pointed out that the typing-monkeys argument does not take into account the physical laws of the universe, which dramatically limit what can be typed. Moreover, the universe is not a typewriter, but a computer, so a monkey types random input into a computer. The computer interprets the strings as programs. Or, in the words of Gershenfeld: "Your genome doesn't store anywhere that you have five fingers. It stores a developmental program, and when you run it, you get five fingers" [73].
Sloman argued that "many of the developments in biological evolution that are so far not understood, and in some cases have gone unnoticed, were concerned with changes in information processing. The same is true of changes in individual development and learning: They often produce new forms of information processing". He addressed this phenomenon through computational ideas about morphogenesis and meta-morphogenesis [74]. His approach offers a new insight: that variation is algorithmic. To Sloman's computational approach, I would add that the steps in variation are morphological computation, that is, physical computation capable of randomly modifying genes and executing morphological programs, which do not present smooth incremental changes but considerable jumps in the properties of structures and processes. Morphological computation also acts through gene regulation, one more process that was unknown both to Darwin and to the proponents of evolution as the Modern Synthesis. Originally, genes were considered to code for specific proteins, and it was believed that all genes were active. Gene regulation involves mechanisms that can repress or induce the expression of a gene. According to Nature [75], "These include structural and chemical changes to the genetic material, binding of proteins to specific DNA elements to regulate transcription, or mechanisms that modulate translation of mRNA".

Learning as Computation in Networks of Agents
In what follows, we will focus on the info-computational framework of learning. Informational structures constituting the fabric of physical nature are networks of networks, which represent semantic relations between data for an agent [18]. Information is organized in levels or layers, from the quantum level to the atomic, molecular, cellular, organismic, social, and so on. Computation/information processing involves data structure exchanges within informational networks, which are instructively represented by Carl Hewitt's actor model [76]. Different types of computation emerge at different levels of organization in nature as exchanges of informational structures between the nodes (computational agents) in the network [17].
The research in computing nature/natural computing is characterized by bi-directional knowledge exchanges, through the interactions between computing and natural sciences [54]. While natural sciences are adopting tools, methodologies, and ideas of information processing, computing is broadening the notion of computation, taking information processing found in nature as computation [2,77]. Based on that, Denning argues that computing today is a natural science, the fourth great domain of science [78,79]. Computation found in nature is a physical process, where nature computes with physical bodies as objects. Physical laws govern processes of computation, which appear on many different levels of organization in nature.
With its layered computational architecture, natural computation provides a basis for a unified understanding of phenomena of embodied cognition, intelligence, and learning (knowledge generation), including meta-learning (learning to learn) [47,80]. Natural computation can be modelled as a process of exchange of information in a network of informational agents [76], i.e., entities capable of acting on their own behalf, which is Hewitt's actor model applied to natural agents.
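As a concrete illustration, a minimal Hewitt-style actor can be sketched in Python (the class, names, and behavior are hypothetical, for illustration only): each actor holds private state and a mailbox, receives messages asynchronously, and reacts to each message by updating its state or sending further messages.

```python
import queue
import threading

class Actor:
    """Minimal Hewitt-style actor: private state, a mailbox, and a
    behavior invoked once per received message (illustrative sketch)."""

    def __init__(self, name, behavior):
        self.name = name
        self.behavior = behavior
        self.state = {}
        self.mailbox = queue.Queue()

    def send(self, message):
        self.mailbox.put(message)   # asynchronous: the sender never waits

    def run(self):
        while True:
            message = self.mailbox.get()
            if message is None:     # sentinel to stop the actor
                break
            self.behavior(self, message)

def counting_behavior(actor, message):
    # A behavior may update local state and send messages to other actors.
    actor.state["count"] = actor.state.get("count", 0) + 1

counter = Actor("counter", counting_behavior)
worker = threading.Thread(target=counter.run)
worker.start()
for _ in range(3):
    counter.send("ping")
counter.send(None)                  # ask the actor to stop
worker.join()
print(counter.state["count"])       # 3
```

The key property mirrored here is asynchrony: senders do not wait for receivers, and the system as a whole has no single well-defined global state at any moment.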
One sort of computation is found on the quantum-mechanical level, where agents are elementary particles and messages (information carriers) are exchanged via force carriers, while different types of computation can be found on other levels of organization in nature. In biology, information processing goes on in cells, tissues, organs, organisms, and eco-systems, with corresponding agents and message types. In biological computing, the message carriers are chunks of information such as molecules, while in social computing they are sentences; the computational nodes (agents) are molecules, cells, and organisms in biological computing, or groups/societies in social computing [19].

Info-Computational Learning by Morphological Computation
The notion of computation in this framework refers to the most general concept of intrinsic computation, that is, spontaneous computation processes in nature [2,77], which serve as a basis for the designed computation found in computing machinery [81]. Intrinsic natural computation includes quantum computation [81,82], processes of self-organization, self-assembly, developmental processes, gene regulation networks, gene assembly, protein-protein interaction networks, biological transport networks, and similar. It is both analog (as found in dynamic systems) and digital. The majority of info-computational processes are sub-symbolic, and some of them are symbolic (like reasoning and languages).
Within the info-computational framework, or computing nature [18], computation on a given level of organization of information presents a realization/actualization of the laws that govern interactions between its constituent parts. On the basic level, computation is a manifestation of causation in the physical substrate [83]. In every next layer of organization, the set of rules governing the system switches to a new emergent regime. It remains to be established how exactly this process goes on in nature, and how emergent properties occur [84]. Research on natural computing is expected to uncover those mechanisms. In the words of Rozenberg and Kari: "(O)ur task is nothing less than to discover a new, broader, notion of computation, and to understand the world around us in terms of information processing" [2]. From research in complex dynamical systems, biology, neuroscience, cognitive science, networks, concurrency, etc., new insights essential for the info-computational view of nature are steadily coming. Here it should be mentioned that the computing nature with "bold" physical computation [85] is the maximal physicalist approach to computing. There are less radical approaches, such as that taken by Horsman, Stepney, and co-authors [86][87][88], known as Abstraction/Representation theory (AR theory), where "physical computing is the use of a physical system to predict the outcome of an abstract evolution", and computation defines the relationship between physical systems and abstract concepts/representations. Unlike AR theory, info-computationalism also embraces computation without representation, in the sense of Brooks [89] or Pfeifer [90]. Even though it is already established that the original Turing model of computation is specific and represents a human performing calculation, as pointed out by Copeland [91], Turing himself started exploring computation beyond the Turing Machine model.
Turing's 1952 paper [92] may be considered a predecessor of natural computing. It addressed the process of morphogenesis by proposing a chemical model as the explanation of the development of biological patterns such as the spots and stripes on animal skin. Turing did not claim that a physical system producing patterns actually performs computation. From the perspective of computing nature, however, we can argue that morphogenesis is a process of morphological computation. Informational structure (as a representation of embodied physical structure) presents a program that governs a computational process [93], which in its turn changes that original informational structure by following/implementing/realizing physical laws.
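The core of Turing's argument can be sketched numerically (the coefficients below are illustrative, chosen only to satisfy the diffusion-driven-instability conditions, and are not from Turing's paper): a two-species linear reaction-diffusion system that is stable without diffusion, but in which a fast-diffusing inhibitor lets small spatial perturbations grow into a pattern.

```python
import random

random.seed(1)

# Linearized activator-inhibitor system on a periodic 1D domain:
#   du/dt = a*u + b*v + Du * u_xx      (u: activator)
#   dv/dt = c*u + d*v + Dv * v_xx      (v: inhibitor)
# Without diffusion the uniform state is stable (trace < 0, det > 0),
# but the fast-diffusing inhibitor destabilizes a band of spatial modes.
a, b, c, d = 1.0, -2.0, 2.0, -3.0    # illustrative reaction coefficients
Du, Dv = 1.0, 10.0                   # inhibitor diffuses 10x faster
N, dx, dt, steps = 100, 1.0, 0.02, 10000

u = [random.uniform(-0.01, 0.01) for _ in range(N)]   # small noise around
v = [random.uniform(-0.01, 0.01) for _ in range(N)]   # the uniform state

def lap(f, i):
    # discrete Laplacian with periodic boundary conditions
    return (f[(i - 1) % N] - 2.0 * f[i] + f[(i + 1) % N]) / dx**2

initial_amp = max(abs(x) for x in u)
for _ in range(steps):
    u, v = (
        [u[i] + dt * (a * u[i] + b * v[i] + Du * lap(u, i)) for i in range(N)],
        [v[i] + dt * (c * u[i] + d * v[i] + Dv * lap(v, i)) for i in range(N)],
    )
final_amp = max(abs(x) for x in u)
print(final_amp > 10 * initial_amp)   # noise has grown into a spatial pattern
```

In the purely linear model the pattern grows without bound; in Turing's chemical systems, nonlinear reaction terms saturate the growth into the stationary spots and stripes.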
Morphology is the central idea in our understanding of the connection between computation and information. Morphological/morphogenetic computing on that informational structure leads to new informational structures via processes of self-organization of information. Evolution itself is a process of morphological computation on a long-term scale. It is also important to take into account the second order process of morphogenesis of morphogenesis (meta-morphogenesis) as done by Sloman [74].
A closely related idea to natural computing is Valiant's [94] view of evolution through "ecorithms", learning algorithms that perform "probably approximately correct" (PAC) computation. Unlike the classical model of the Turing machine, "ecorithmic" computation does not give perfect results, but results good enough (for an agent). That is the case for natural computing in biological agents, who always act under resource constraints, especially limitations of time, energy, and material, unlike the Turing machine model of computation, which by definition operates with unlimited resources. An older term for PAC, due to Simon, is "satisficing" [95] (p. 129): "Evidently, organisms adapt well enough to 'satisfice'; they do not, in general, 'optimize'".
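The "probably approximately correct" criterion can be made concrete with a standard textbook example (not Valiant's ecorithms themselves; the threshold and parameter values are illustrative): learning an unknown threshold on [0, 1] from randomly drawn labelled examples. A modest sample suffices for a hypothesis that errs on at most an ε fraction of inputs with probability at least 1 − δ: good enough, not perfect.

```python
import math
import random

random.seed(7)

# Unknown target concept on [0, 1]: inputs x >= t are labelled positive.
TRUE_T = 0.62                        # hidden from the learner
epsilon, delta = 0.05, 0.05          # accuracy and confidence parameters

# Classic PAC sample bound for threshold functions on [0, 1]:
# m >= (1/epsilon) * ln(1/delta) labelled examples suffice.
m = math.ceil((1 / epsilon) * math.log(1 / delta))

xs = [random.random() for _ in range(m)]
positives = [x for x in xs if x >= TRUE_T]

# "Satisficing" learner: the tightest threshold consistent with the sample.
learned_t = min(positives) if positives else 1.0

# Under uniform inputs, the true error is exactly the mass of [TRUE_T, learned_t).
error = learned_t - TRUE_T
print(m)                             # 60 examples for eps = delta = 0.05
print(error <= epsilon)              # True with probability at least 1 - delta
```

The learner never recovers the exact threshold, yet a sample of only 60 points makes it approximately correct with high probability, which is precisely the resource-bounded, satisficing character of "ecorithmic" computation.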

Learning to Learn from Raw Data and up-Agency from System 1 to System 2
Cognition is a result of processes of morphological computation on the informational structures of a cognitive agent in interaction with the physical world, with processes going on at both sub-symbolic and symbolic levels [4]. This morphological computation establishes connections between an agent's body, its nervous system (control), and its environment [49]. Through embodied interaction with the informational structures of the environment, via sensory-motor coordination, information structures are induced (stimulated, produced) in the sensory data of a cognitive agent, thus establishing perception, categorization, and learning. Those processes result in constant updates of memory and other structures that support behavior, particularly anticipation. Embodied and correspondingly induced informational structures (in Sloman's sense of a virtual machine) [96] are the basis of all cognitive activities, including consciousness and language as a means of maintaining "reality", or the representation of the world in the agent.
From the simplest cognizing agents such as bacteria to complex biological organisms with nervous systems and brains, the basic informational structures undergo transformations through morphological computation as developmental and evolutionary form generation: morphogenesis. Living organisms as complex agents inherit bodily structures resulting from a long evolutionary development of species. Those structures are the embodied memory of the evolutionary past [54]. They present the means for agents to interact with the world, get new information that induces embodied memories, learn new patterns of behavior, and learn/construct new knowledge. By Hebbian learning in the brain (where neurons that fire together wire together, and habits increase the probability of firing), the world shapes a human's (or an animal's) informational structures. Neural networks that "self-organize stable pattern recognition code in real-time in response to arbitrary sequences of input patterns" are an illustrative example [97].
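The Hebbian principle can be sketched in a few lines (a minimal illustration; the rates and inputs are made up, and realistic models bound the growth, e.g., with Oja's rule): synapses whose inputs are repeatedly co-active with the output are strengthened, while silent synapses are left unchanged.

```python
# Minimal Hebbian rule: delta_w = eta * x * y  (values are illustrative).
eta = 0.1                  # learning rate
w = [0.0, 0.0, 0.0]        # synaptic weights of one postsynaptic neuron

# Inputs 0 and 1 repeatedly fire together with the output; input 2 is silent.
for _ in range(20):
    x = [1.0, 1.0, 0.0]
    y = sum(wi * xi for wi, xi in zip(w, x)) + 1.0   # external drive keeps y > 0
    w = [wi + eta * xi * y for wi, xi in zip(w, x)]  # co-activity strengthens

# Plain Hebbian growth is unbounded; real models add normalization (Oja's rule).
print(w[0] > w[2], w[2])   # co-active synapses strengthened; silent one unchanged
```

Repeated co-activation thus leaves a structural trace in the weights, which is the simplest sense in which interaction with the world shapes an agent's informational structures.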
On the fundamental level of the quantum mechanical substrate, information processes represent the actions of the laws of physics. Physicists are already working on reformulating physics in terms of information [98][99][100][101][102][103]. This development can be related to Wheeler's idea "it from bit" [104] and von Weizsäcker's ur-alternatives [105].
In the computing nature approach, nature consists of physical structures that form levels of organization, on which computational processes develop. It has been argued that on the lower levels of organization, finite automata or Turing machines might be an adequate model of computation, while in the case of human cognition at the whole-brain level, non-Turing computation is necessary; see Ehresmann [106] and Ghosh et al. [107]. Symbols on the higher levels of abstraction (System 2) are related to several possible sub-symbolic realizations, which they generalize, as Ehresmann's models show. Zenil et al.'s work on causality via algorithmic generative models, which "decompose an observation into its most likely algorithmic generative models" [108], presents one of the recent attempts to approach causality computationally/algorithmically. Algorithmic computation, based on symbol manipulation, is a very important part of the computational models defined by Turing. The connection to the sub-symbolic level is made through algorithmic information theory.
Apart from the Handbook of Natural Computing [77], which presents concrete models of natural computation, interesting work on the computational modelling of biochemistry and reaction networks has been done by Cardelli [109][110][111][112], including the study of morphisms of reaction networks that link structure to function. On the side of cognitive computing, Fresco addresses physical computation and its role in cognition [113].
Principles of morphological computing and data self-organization from biology have been applied in robotics as well. In recent years, morphological computing has emerged as a new idea in robotics, see [3,4] and references therein. Initially, robotics treated the body as a machine and its control as a program, separately. Meanwhile, it has become apparent that embodiment itself is fundamental for cognition, generation of behavior, intelligence, and learning. Embodiment is central because cognition arises from the interaction of brain, body, and environment [90]. Agents' behavior develops through embodied interaction with the environment, in particular through sensory-motor coordination, when information structure is induced in the sensory data, thus leading to perception, learning, and categorization [48]. Morphological computing has also been applied in soft robotics, self-assembling systems, molecular robotics, embodied robotics, and more. Even though the use of morphological computing in robotics differs slightly from that in computing nature, there are common grounds and possibilities to learn from each other on the multidisciplinary level. The same goes for the research being done in the fields of cognitive informatics and cognitive computing. There are also important connections to computational mechanics, algorithmic information dynamics (a probabilistic framework used for causal analysis), and neuro-symbolic computation, combining symbolic and neural processing, all of which are in different ways relevant to the topic. Those connections remain to be explored in future work.

Conclusions and Future Work
The info-computational approach, developed by the author, with natural morphological computation as its basis, is used to approach learning and learning to learn in humans, other living organisms, and intelligent machines. This paper is a contribution to the epistemology of the philosophy of nature, proposing a new perspective on the learning process, both in artificial information processing systems such as robots and AI systems, and in natural information processing systems such as living organisms.
Morphological computation is proposed as a mechanism of learning and meta-learning, necessary for connecting the pre-symbolic (pre-conscious) with the symbolic (conscious) information processing. In the framework of info-computational nature, morphological computation is information (re)structuring through computational processes which follow (implement) physical laws. It is grounded in the notion of agency, with causality represented by morphological computation.
Morphology is the central idea in understanding the connection between computation (morphological/morphogenetic) and information. Morphology refers to form, shape, and structure. Materials represent morphology on the underlying level of organization: for arrangements of molecular and atomic structures, the material, on the level below, consists of protons, neutrons, and electrons.
Morphological computation can be represented as information communication between agents/actors of the Hewitt actor model, distributed in space, where computational devices communicate asynchronously and the computation as a whole is generally not in any well-defined state [3]. Unlike Turing computation, which is a mathematical-logical model, Hewitt computation is a physical model. For morphological computing as information (re)structuring through computational processes which follow (implement) physical laws, Hewitt computation provides a consistent formalization. On the basic level, morphological computation is natural computation in which physical objects perform computation. Symbol manipulation in this case is physical object manipulation, in the sense of Brooks' "the world is its own best model". This becomes relevant in robotics and deep learning systems that directly manage the behavior of an agent in the physical world.
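The defining features of Hewitt-style computation mentioned above, namely private local state, asynchronous message passing, and the absence of any well-defined global state, can be made concrete in a minimal sketch (an illustration added here, not part of Hewitt's formalism; all names are hypothetical):

```python
import threading
import queue

class Actor:
    """A minimal actor: private state, an asynchronous mailbox,
    and a behavior applied to each incoming message. No global
    snapshot of all actors' states exists at any moment."""
    def __init__(self, name, behavior):
        self.name = name
        self.state = {}
        self.mailbox = queue.Queue()
        self.behavior = behavior
        self.thread = threading.Thread(target=self._run, daemon=True)
        self.thread.start()

    def send(self, message):
        # Asynchronous: the sender does not wait for delivery or a reply.
        self.mailbox.put(message)

    def _run(self):
        while True:
            message = self.mailbox.get()
            if message is None:      # sentinel stops the actor
                break
            self.behavior(self, message)

# Example behavior: accumulate numbers received in messages.
def accumulate(actor, message):
    actor.state["total"] = actor.state.get("total", 0) + message

adder = Actor("adder", accumulate)
for n in (1, 2, 3):
    adder.send(n)        # messages arrive asynchronously
adder.send(None)         # ask the actor to stop
adder.thread.join()
print(adder.state["total"])   # 6
```

The sender returns immediately after `send`, while the actor processes its mailbox in its own thread, which is the sense in which the whole computation has no single well-defined state.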
In morphological computation, cognition is the restructuring of an agent through interaction with the world, so all living organisms possess some degree of cognition. As a result of evolution, increasingly complex living organisms arise from simple ones that are able to survive and adapt to their environment. This means they are able to register inputs (data) from the environment, to structure those into information, and, in more developed organisms, into knowledge. The evolutionary advantage of using structured, component-based approaches is improved response time and efficiency of an organism's cognitive processes, which drives the development from organisms with learning on the System 1 level to those that acquire System 2 capabilities on top of it. In more complex cognitive agents, knowledge is built not only upon reactions to input information, but also on internal information processing with intentional choices, dependent on value systems stored and organized in the agent's memory.
Knowledge generation places information and computation (communication) in focus, as information and its processing are the essential structural and dynamic elements characterizing the structuring of input data (data → information → knowledge → metaknowledge) by an interactive computational process going on in the agent during its adaptive interplay with the environment.
In nature, through the processes of evolution and development, living systems learn to survive and thrive in their environment. Interactions take forms of reinforcement learning or Hebbian learning that make previously successful strategies preferred in the future [70]. This happens on a variety of levels of organization. On the meta-level, meta-morphological computing (as a Sloman virtual machine) [96] governs learning to learn.
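The reinforcement mechanism by which previously successful strategies become preferred can be sketched in a few lines (a toy illustration, not a model from the paper; strategy names, reward values, and the learning rate are all hypothetical):

```python
# Minimal reinforcement sketch: a strategy's preference grows with the
# reward it brings, so previously successful strategies are preferred.

def reinforce(preferences, strategy, reward, eta=0.5):
    """Increase the preference for a strategy in proportion to its reward."""
    preferences[strategy] += eta * reward
    return preferences

preferences = {"forage_left": 1.0, "forage_right": 1.0}

# Suppose foraging right yields food (reward 1) and left does not (reward 0).
for _ in range(4):
    reinforce(preferences, "forage_right", reward=1.0)
    reinforce(preferences, "forage_left", reward=0.0)

preferred = max(preferences, key=preferences.get)
print(preferred)                    # forage_right
print(preferences["forage_right"])  # 3.0
```

Starting from equal preferences, only the rewarded strategy accumulates weight, so it ends up preferred, which is the sense in which past success shapes future behavior.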
In the case of human learning, the brain as a network of computational agents processes information obtained through embodied communication with the environment as well as internal information from the body. Consciousness is a process of integration of information in the brain [35]. The brain receives a huge amount of data/information that would be overwhelming to handle in real time, so it uses the mechanism of attention to focus on a specific subset of information, typically regarding agent-based processes in the world. There, changes in the scene are consequences of the agent's interactions, and they are the unfolding of physical processes of morphological computation. Causality, or rather stable correlations between structures and processes in the world (from an agent's perspective), is what humans learn and memorize, as these correlations become organized internally through the Hebbian principle that neurons that fire together, wire together.
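The Hebbian principle invoked above admits a compact formulation: the change in the connection weight between two units is proportional to the product of their activities, so co-active units strengthen their mutual link. A minimal sketch (illustrative numbers and learning rate only, not drawn from the paper):

```python
# Hebbian update: dw_ij = eta * x_i * x_j
# Units that are co-active ("fire together") strengthen
# their mutual connection ("wire together").

def hebbian_step(weights, activities, eta=0.1):
    """Apply one Hebbian update to a square weight matrix."""
    n = len(activities)
    for i in range(n):
        for j in range(n):
            if i != j:  # no self-connections
                weights[i][j] += eta * activities[i] * activities[j]
    return weights

n = 3
w = [[0.0] * n for _ in range(n)]

# Units 0 and 1 repeatedly fire together; unit 2 stays silent.
for _ in range(5):
    w = hebbian_step(w, [1.0, 1.0, 0.0])

print(round(w[0][1], 2))  # 0.5 -- co-activity strengthened the link
print(w[0][2])            # 0.0 -- no co-activity, no change
```

Only the connection between the co-active units grows, which is how stable correlations in the input become stored in the agent's structure.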
Sloman, who developed the theory of Meta-Morphogenesis [74], started with the idea that changes in the individual development and learning of an agent produce new forms of information processing [74]. His approach offers the new insight that variation is algorithmic. The interplay between structure and process is essential for learning, as past experiences stored in structures affect the possibility of future processes and strategies of learning and learning to learn. To Sloman's morphogenetic approach, I would propose to add that the steps in variation are results of morphological computation, that is, physical computation capable of, e.g., modifying genes and executing morphological programs which do not present smooth incremental changes, but jumps in the properties of structures and processes. Morphological computation also acts through gene regulation, one more process that was unknown both to Darwin and to the proponents of evolution as the Modern Synthesis.
Since contemporary deep-learning-centered AI (dealing with human-level cognition and above) is gradually developing from its present coverage of System 1 (connectionist, sub-symbolic) towards System 2 (symbolic), with agency, causality, consciousness, and attention as mechanisms of learning and meta-learning [5,114], it searches for mechanisms of transition between the two systems. As an inspiration for technology development, the human brain is of interest as the center of learning in humans: self-organized, resilient, fault-tolerant, plastic, computationally powerful, and energetically efficient. In its development, as in the past, deep learning is inspired by nature, assimilating ideas from neuroscience, cognitive science, biology, and more. The AI approach to understanding, via decomposition and construction, is close to other computational models of nature in that it seeks testable and applicable models based on data and information processing. Bengio's proposal of an agent-based perspective [5], necessary to proceed from System 1 to System 2 learning, can be related to the model of learning based on morphological computing.
For the future, more interdisciplinary/crossdisciplinary/transdisciplinary work remains to be done as a way to increase understanding of the connections between low-level and high-level cognitive processes, learning, and meta-learning. It will also be instructive to find relations between (levels or degrees of) cognition and consciousness as mechanisms that help reduce the number of variables manipulated by an agent for the purposes of perception, reasoning, decision-making, planning, acting/agency, and learning.
The goals of artificial intelligence, as well as robotics, differ from those of computing nature and morphological computing. AI builds solutions to practical problems, and in doing so it typically focuses on the highest possible level of intelligence, even though among the AI fields inspired by computing nature there is developmental robotics, which has a more explorative character.
The priority of info-computational naturalism is understanding and connecting knowledge about nature, while much current technology searches for inspiration in nature in pursuit of new technological solutions. The paths of the two are meeting, and a mutual exchange of ideas is beneficial for both sides. Specialist sciences and philosophies also need close communication and exchange of ideas. Learning and meta-learning within computing nature is a topic of such central importance that it calls for more knowledge from a variety of fields. This paper is not only a presentation of what is already known, but also an attempt to indicate how much more remains to be done.
Funding: This research was funded by the Swedish Research Council, VR grant MORCOM@COG.