This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution license (http://creativecommons.org/licenses/by/3.0/).

Processes considered to render information dynamics have been studied in, among other settings: questions and answers, observations, communication, learning, belief revision, logical inference, game-theoretic interactions and computation. This article puts the computational approaches into the broader context of natural computation, where information dynamics is found not only in human communication and computational machinery but throughout nature. Information is understood as representing the world (reality as an informational web) for a cognizing agent, while information dynamics (information processing, computation) realizes physical laws through which all changes of informational structures unfold. Computation as it appears in the natural world is more general than the human process of calculation modeled by the Turing machine. Natural computing is epitomized by the interactions of concurrent, in general asynchronous, computational processes, which are adequately represented by what Abramsky names “the second generation models of computation” [

This paper has several aims. Firstly, it offers a computational interpretation of information dynamics of the info-computational universe [

Both information and computation are concepts and phenomena still intensely researched from a multitude of perspectives. In order to meaningfully model the universe as a computational network, the model of computation must be adequate. When it comes to generalizing the idea of computation, several confusions recur. The most persistent one concerns the relationship between the universe and the computer. It must be emphasized that the universe is not equivalent to a PC in any interesting way; when we talk about a computing universe, the notion of computation must be generalized so that it can reflect the richness of phenomena found in nature. The attempt to represent vastly complex systems, comprising a huge number of different organizational levels, by a Turing machine model is inadequate, and indeed presents too powerful a metaphor, as already noticed by critics [

Van Benthem and Adriaans [

The early developments of the field of the dynamics of information, such as the seminal work of Dretske [

Depending on the understanding of information, different types of information dynamics are defined by Floridi [

The first two types of the dynamics of information refer to the classical notion of information as a constituent of knowledge. Information processing is a more general type of dynamics and can be applied to any type of information.

A study of information dynamics within a framework of logic is presented in van Benthem's recent book [

Informational dynamics is by Hofkirchner [

Info-computationalism (ICON) [

Info-computationalism is a unifying approach that brings together

In sum: if the physical universe is an informational structure, natural computation is a process governing the dynamics of information. Information and computation are two mutually defining concepts [

Information dynamics expressed as computation is related to information by Burgin in the article

Information is fundamental as a basis for all knowledge and its processing lies behind all our cognitive functions. In a wider sense of proto-information, information represents every physical/material phenomenon, [

Instructive in this context is connecting the information flow in computation, interactive processes and games (representing the rules or logic) as steps towards a “fully-fledged dynamical theory” [

The claim that “information

Abramsky [

From everyday experience we know that computation can provide information. Typical examples are search functions and automatic translation. On the web, a search engine finds, by computation, the information we are interested in. Wolfram Alpha computes even further, filling in the gaps in the information that exists on the web. Or, in short, as [

In agent-based models, which are a class of computational models for simulating the behavior of autonomous agents, not only the notion of an agent but also the idea of their interactions must be generalized. What is exchanged during communication between agents can vary, and is not necessarily words or written symbols. “

The notion of computation as formal (mechanical) symbol manipulation originates from discussions in mathematics in the early 20th century. The most influential program for formalization was initiated by Hilbert, who treated formalized reasoning as a symbol game in which the rules of derivation are expressed in terms of the syntactic properties of symbols. As a result of Hilbert's program, large areas of mathematics have been formalized. Formalization implies the establishment of a basic language used to formulate the axioms and derivation rules, defined such that the important semantic relationships are preserved by inferences based only on the syntactic form of expressions. Hilbert's

The basic idea was that any operations that are sensitive only to syntax can be simulated mechanically. What the human following a formal procedure/algorithm does by recognition of syntactic patterns, a machine can be made to do by purely mechanical means. Formalization and computation are closely related and together entail that reasoning which can be formalized can also be simulated by the Turing machine. Turing assumed that a machine operating in this way would actually be doing the same thing as the human performing computation.
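The idea that operations sensitive only to syntax can be simulated mechanically is easy to make concrete. The following minimal Turing machine interpreter is a sketch of my own (not taken from any cited source; the bit-inverting machine is an arbitrary example): it manipulates tape symbols using nothing but a syntactic transition table of the form (state, symbol) → (state, symbol, move).

```python
# Minimal Turing machine interpreter: purely syntactic rules
# (state, symbol) -> (new state, symbol to write, head move) suffice.
def run_turing_machine(rules, tape, state="start", blank="_", max_steps=1000):
    tape = dict(enumerate(tape))   # sparse tape indexed by head position
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = tape.get(head, blank)
        state, write, move = rules[(state, symbol)]
        tape[head] = write
        head += 1 if move == "R" else -1
    return "".join(tape[i] for i in sorted(tape))

# An arbitrary example machine: invert a binary string, moving right
# until the blank symbol is reached, then halt.
rules = {
    ("start", "0"): ("start", "1", "R"),
    ("start", "1"): ("start", "0", "R"),
    ("start", "_"): ("halt", "_", "R"),
}
print(run_turing_machine(rules, "0110"))  # prints "1001_"
```

The machine never inspects what the symbols mean; it only matches their syntactic form, which is exactly the sense in which formalized reasoning can be mechanized.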

The Turing machine is supposed to be given from the outset—its logic, its physical resources, and the meanings ascribed to its actions. The Turing machine model essentially presupposes a human as a part of the system—the human is the one who provides material resources, poses the questions, and interprets the answers.

The Church-Turing thesis states that a Turing machine can perform any effective computation,

The Church-Turing thesis has been extended to a proposition about the processes in the natural world by Stephen Wolfram in his

One of the problems in the discussion about the Turing machine model is the word “machine” as the

Ever since Turing proposed his machine model, which identifies computation with the execution of an algorithm, there have been questions about how widely the Turing machine model might be applicable. The Church-Turing thesis establishes the equivalence between a Turing machine and an algorithm, interpreted as implying that all of computation must be algorithmic. However, with the advent of computer networks, the model of a computer in isolation, represented by a Turing machine, has become insufficient. Today's software-intensive and intelligent computer systems have become large, consisting of massive numbers of autonomous and parallel elements across multiple scales. At the nano-scale they approach programmable matter; at the macro scale, huge numbers of cores compute in clusters, grids or clouds, while satellite data are used to track global-scale phenomena. What these modern computing systems have in common is that they are

At the moment, the closest to common acceptance is the view of

Unlike the Turing machine model, which originates from Hilbert's program for logic and mathematics (proposed in the early 1920s), other types of models of computation, such as process models (Petri nets, Process Algebra, and Agent-Based models), appeared in the past decades (the theory of Petri nets in 1962), specifically in Computer Science. Indicatively, present-day formal methods in Systems Biology include Rule-Based Modeling of Signal Transduction, Process Algebras, Abstract Interpretation, Model Checking, Agent-Based Modeling of Cellular Behavior, Boolean Networks, Petri Nets, State Charts and Hybrid Systems. However, concurrency models have emerged in a bottom-up fashion in order to tackle present-day networks of computational systems, and it will take some years until they become shared tools of thinking in a new paradigm of computation.
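To make one of these process models concrete, here is a toy Petri net sketch of my own (the producer/consumer naming is an illustrative assumption, not from the cited literature): places hold tokens, and a transition fires when all of its input places are marked, consuming one token from each input and producing one in each output.

```python
# Toy Petri net: a transition is a pair (input places, output places).
def enabled(marking, transition):
    inputs, _ = transition
    return all(marking.get(p, 0) >= 1 for p in inputs)

def fire(marking, transition):
    # Firing consumes one token per input place, produces one per output.
    inputs, outputs = transition
    marking = dict(marking)
    for p in inputs:
        marking[p] -= 1
    for p in outputs:
        marking[p] = marking.get(p, 0) + 1
    return marking

# Two concurrent producers feeding one consumer through a shared buffer.
t_produce = (["idle"], ["buffer"])
t_consume = (["buffer"], ["done"])

m = {"idle": 2, "buffer": 0, "done": 0}
m = fire(m, t_produce)   # one producer deposits a token
m = fire(m, t_produce)   # the other producer fires independently
m = fire(m, t_consume)   # the consumer removes one token from the buffer
print(m)                 # {'idle': 0, 'buffer': 1, 'done': 1}
```

The point of the formalism is that no global clock orders the firings: any enabled transition may fire, which is what makes Petri nets a natural model of concurrent, asynchronous processes.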

Natural computation/physical computation is a new paradigm of computing which deals with computability in the physical world and has brought a fundamentally new understanding of computation [

Kampis, for example, in his book

“a component system is a computer which,

Natural computing is characterized by a

Compared with the new computing paradigms, Turing machines form a proper subset of the set of information processing devices, in much the same way that Newton's theory of gravitation is a special case of Einstein's theory, or Euclidean geometry is a limit case of non-Euclidean geometries.

Natural computation is a study of computational systems including the following: (1) Computing techniques that take inspiration from nature for the development of novel problem-solving methods; (2) Use of computers to simulate natural phenomena; and (3) Computing with natural materials (e.g., molecules, atoms).

Natural computation is well suited for dealing with large, complex, and dynamic problems. It is an emerging interdisciplinary area closely related to artificial intelligence and cognitive science, vision and image processing, Neuroscience, Systems Biology and Bioinformatics, to mention but a few.

Fields of research within natural computing include, among others, Biological Computing/Organic Computing, Artificial Neural Networks, Swarm Intelligence, Artificial Immune Systems, computing on continuous data, Membrane Computing, Artificial Life, DNA computing, Quantum computing, Neural computation, Evolutionary computation, evolvable hardware, self-organizing systems, emergent behaviors, machine perception and Systems Biology.

Evolution is a good example of a natural computational process. The kind of computation it performs is morphological computation [

For computationalism,

According to pancomputationalism (naturalist computationalism) [

Natural computing has specific criteria for the success of a computation. Unlike the case of the Turing model, the halting problem is not a central issue, but instead

In many areas, we have to computationally model emergence that is not algorithmic [

Much like research in other disciplines of Computing such as AI, SE, and Robotics, Natural computing is interdisciplinary and takes a synthetic approach, unifying knowledge from a variety of related fields. Research questions, theories, methods and approaches are drawn from Computer Science (such as the theory of automata and formal languages, interactive computing), Information Science (e.g., Shannon's theory of communication), ICT studies, mathematics (such as randomness, algorithmic information theory), Logic (e.g., pluralist logic, game logic), Epistemology (especially naturalized epistemologies), evolution and Cognitive Science (mechanisms of information processing in living organisms) in order to investigate foundational and conceptual issues of Natural computation and information processing in nature.

“(O)ur task is nothing less than to discover a new, broader, notion of computation, and to understand the world around us in terms of information processing” [

This development necessitates what [

Konrad Zuse was the first to suggest (in 1967) that the physical behavior of the entire universe is being computed on a basic level, possibly by cellular automata, by the universe itself, which he referred to as “Rechnender Raum” or Computing Space/Cosmos. Consequently, Zuse was the first pancomputationalist (naturalist computationalist). Here is how Chaitin explains pancomputationalism:

“And how about the entire universe, can it be considered to be a computer? Yes, it certainly can, it is constantly computing its future state from its current state, it's constantly computing its own time-evolution! And as I believe Tom Toffoli pointed out, actual computers like your PC just hitch a ride on this universal computation!” [

Fredkin in his

Even Wolfram in his

Wolfram's critics remark, however, that cellular automata do not evolve beyond a certain level of complexity; the mechanisms involved do not necessarily produce evolutionary development. The actual physical mechanisms at work in the universe appear to be quite different from simple cellular automata. Critics also claim that it is unclear whether the cellular automata are to be thought of as a metaphor or whether real systems are supposed to use the same mechanism on some level of abstraction. Wolfram meets this criticism by pointing out that cellular automata are models, and as such, surprisingly successful ones.
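For concreteness, the kind of model at issue, an elementary cellular automaton, can be sketched in a few lines. This is my own illustration using Rule 110, one of the rules Wolfram studies (the code is not Wolfram's): each cell's next state depends only on itself and its two neighbors, via a fixed local rule.

```python
# Elementary cellular automaton on a ring: the next state of cell i is
# read off from the rule number, indexed by the 3-bit neighbourhood
# (left, centre, right) encoded as an integer 0..7.
def step(cells, rule=110):
    n = len(cells)
    return [
        (rule >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

# Start from a single live cell: simple, fully local rules generate
# a complex global pattern, which is Wolfram's central observation.
row = [0] * 31
row[15] = 1
for _ in range(15):
    print("".join(".#"[c] for c in row))
    row = step(row)
```

Running it prints a growing triangular pattern; changing `rule` changes the global behavior drastically even though the local mechanism is identical, which is what the classification into behavioral classes is about.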

In his article on Physical Computation for Stanford Encyclopedia of Philosophy, Piccinini [

The

As for the

Info-computationalism is, in Piccinini's scheme, a kind of limited computationalism based on the third source:

“A third alleged source of pancomputationalism is that every physical state carries information, in combination with an information-based semantics plus a liberal version of the semantic view of computation. According to the semantic view of computation,

The use of the word “manipulation” seems to suggest a conscious intervention (“the practice of manipulating”, Wiktionary), while computation in general is a dynamical process that drives (through interaction mechanisms) changes in informational structures. Notwithstanding Piccinini's skepticism, there are well-established theories in computer science, see [

One of the frequent criticisms of computational approaches is based on the understanding that computation must always be discrete and that some continuous processes (such as the human mind) can never be adequately represented by computational models. Here, several confusions act together, and it is useful to disentangle them.

Wolfram and Fredkin assume that the universe on a fundamental level is a discrete system, and thus a suitable basis for an all-encompassing digital computer. But the hypothesis about the discreteness of the physical world is not decisive for pancomputationalism/naturalist computationalism. As is well known, besides digital computers there are analog computers. On a quantum-mechanical level, the universe performs computation [

“as we delve deeper and deeper into both natural and artificial processes, we find the nature of the process often alternates between analog and digital representations of information. As an illustration, I noted how the phenomenon of sound flips back and forth between digital and analog representations. (…) At a yet deeper level, Fredkin, and now Wolfram, are theorizing a digital (i.e., computational) basis to these continuous equations. It should be further noted that if someone actually does succeed in establishing such a digital theory of physics, we would then be tempted to examine what sorts of deeper mechanisms are actually implementing the computations and links of the cellular automata. Perhaps, underlying the cellular automata that run the Universe are yet more basic analog phenomena, which, like transistors, are subject to thresholds that enable them to perform digital transactions” [

Now it should be emphasized that “computational” is not identical with “digital”, and Maley [

Lloyd makes a statement equivalent to Calude's [

“In a quantum computer, however, there is no distinction between analog and digital computation. Quanta are by definition discrete, and their states can be mapped directly onto the states of qubits without approximation. But qubits are also continuous, because of their wave nature; their states can be continuous superpositions. Analog quantum computers and digital quantum computers are both made up of qubits, and analog quantum computations and digital quantum computations both proceed by arranging logic operations between those qubits. Our classical intuition tells us that analog computation is intrinsically continuous and digital computation is intrinsically discrete. As with many other classical intuitions, this one is incorrect when applied to quantum computation. Analog quantum computers and digital quantum computers are one and the same device” (Lloyd, 2006).

Thus, establishing a digital basis for physics at a certain level of granularity will not resolve the philosophical debate as to whether the physical universe is ultimately digital or analog. Nonetheless, establishing a feasible computational model of physics would be a major achievement.

Let us start from the assertion that a dichotomy exists between the discrete and continuous nature of reality in classical physics. From the cognitive point of view it is clear that most of the usual dichotomies are coarse approximations. They are useful, and they speed up our perception and reasoning considerably. Following Kant, however, we can say that “Ding an Sich” (thing-in-itself) is nothing we have knowledge of. This is also true in the case of the discrete-continuous question. Human cognitive categories are the result of natural evolutionary adaptation to the environment. Given the bodily morphology we have, they are definitely strongly related to the nature of the physical world which we live in, but they are by no means general tools for understanding the universe as a whole at all levels of organization and for all types of phenomena that exist. If we adopt the dichotomy as our own epistemological necessity, how could the continuum/discrete universe be understood?

In what follows I will argue that discrete and continuous are dependent upon each other—that logically there is no way to define the one without the other. Let us begin by assuming that the basic physical phenomena are discrete. Let us also assume that they appear in finite discrete quanta, packages, amounts or extents. If the quanta are infinitely small then they already form a continuum. So in order to get discretization, quanta must be finite.

Starting with finite quanta, one can understand the phenomenon of the continuum as a consequence of the processes of (asynchronous) communication between different systems. Even if the time interval between two signals that one system produces always has some definite non-zero value (discrete signals), two

Abramsky summarizes the process of changing paradigm of computing as follows:

“Traditionally, the dynamics of computing systems, their unfolding behavior in space and time has been a mere means to the end of computing the

This also suggests the possibility that the lack of an “adequate structural theory of processes” may explain the lack of fundamental progress in the theory of complexity. For example, Calude in [

According to [

If computation is understood as a physical process, if Nature computes with physical bodies as objects (informational structures) and physical laws governing the process of computation, then computation necessarily appears on many different levels of organization. Natural sciences provide such a layered view of Nature. One sort of computational process is found on the quantum-mechanical level of elementary particles, atoms and molecules; yet another on the level of classical physical objects. In the sphere of biology, different processes (computation, information processing) go on in biological cells, tissues, organs, organisms, and eco-systems. Social interactions are governed by still another kind of communicative/interactive process. In short, computation on a given level of organization is the implementation of the laws that govern the interaction between the different parts constituting that level. Consequently, on every next level of organization a new set of rules governing the system takes over. How exactly this happens in practice remains to be learned. Recently, simulation tools have been developed which allow for the study of the behavior of complex systems modeled computationally. For the analysis of the time development of dynamic systems, various simulation techniques have been developed, from purely mathematical approaches, e.g., equation-based modeling simulated by iterative evaluation, to formal modeling approaches, such as Petri Nets and Process Algebra, together with object-oriented and agent-oriented simulation methods based on the emulation of constituent system elements.
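As a minimal example of the first approach, equation-based modeling simulated by iterative evaluation, consider logistic growth integrated by Euler steps (an illustration of my own; all parameter values are arbitrary assumptions):

```python
# Equation-based modeling by iterative evaluation: logistic growth
# dx/dt = r * x * (1 - x/K), integrated with explicit Euler steps.
# All parameter values below are arbitrary, for illustration only.
def simulate_logistic(x0=1.0, r=0.5, K=100.0, dt=0.1, steps=200):
    x = x0
    trajectory = [x]
    for _ in range(steps):
        x += dt * r * x * (1 - x / K)   # iterative evaluation of the equation
        trajectory.append(x)
    return trajectory

traj = simulate_logistic()
# The state grows from x0 toward the carrying capacity K.
print(traj[0], traj[-1])
```

Here the whole system is a single aggregate variable evolved by one global equation; agent-oriented methods, by contrast, emulate the constituent elements individually and let the aggregate behavior emerge.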

One of the criticisms of pancomputationalism based on cellular automata is presented by Kurzweil, and concerns complexity. Cellular automata are a surprisingly fruitful model, and they have led to the development of a new kind of scientific method—generative modeling [

“Wolfram considers the complexity of a human to be equivalent to that of a Class 4 automaton because they are, in his terminology, “computationally equivalent.” But Class 4 automata and humans are only computationally equivalent in the sense that any two computer programs are computationally equivalent, i.e., both can be run on a Universal Turing machine. It is true that computation is a universal concept, and that all software is equivalent on the hardware level (i.e., with regard to the nature of computation), but it is not the case that all software is of the same order of complexity. The order of complexity of a human is greater than the interesting but ultimately repetitive (albeit random) patterns of a Class 4 automaton” [

My comments are the following. What seems to be missing in cellular automata is the dependence of the rules (implemented in the process of computation) on the underlying structure of the system. In a physical system such as magnetic iron, the mean electromagnetic field of the whole system affects each of its parts (magnetic atoms in a crystal lattice), so interactions are not only local, with the closest neighbors, but also reflect global properties of the system. Cellular automata are synchronously updated, which according to Sloman makes them computationally less expressive than systems with asynchronous interactions. Agent-based models, which are currently being developed, are generalizations of cellular automata and can avoid those limitations. They are essentially decentralized, bottom-up, and in general asynchronous models. (Synchronous communication, where agents exchange information all at the same time, is a special case of asynchronous information exchange.) The behavior is defined at the individual agent level, and the global behavior emerges as a result of interaction among numerous individuals communicating with each other and with the environment. (For a quick and concise introduction to agent-based models, see for example Castiglione's article in Scholarpedia.)
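The difference between synchronous and asynchronous updating can be made concrete with a toy example of my own (not from the cited sources): a ring of binary cells where each cell copies its left neighbor. One asynchronous sweep can propagate a state around the whole ring, something no single synchronous step can do.

```python
import random

# Toy ring of binary cells; each cell's rule is "copy your left neighbour".
def sync_step(cells):
    # Synchronous update: every cell reads the OLD configuration,
    # then all cells switch at once -- a lone 1 shifts one step right.
    return [cells[i - 1] for i in range(len(cells))]

def async_step(cells, order=None):
    # Asynchronous update: cells update one at a time in some order,
    # each seeing the effects of updates made earlier in the same sweep.
    cells = list(cells)
    if order is None:
        order = random.sample(range(len(cells)), len(cells))
    for i in order:
        cells[i] = cells[i - 1]
    return cells

start = [1, 0, 0, 0]
print(sync_step(start))                       # [0, 1, 0, 0]
print(async_step(start, order=[1, 2, 3, 0]))  # [1, 1, 1, 1]
```

With the update order chosen as above, the asynchronous sweep floods the whole ring in one pass, a behavior outside the reach of any single synchronous step; this is the sense in which asynchronous interaction is the more expressive regime, with synchrony as a special case.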

In short, solutions are being sought in natural systems, with their evolutionarily developed strategies for handling complexity, in order to improve the modeling and construction of complex networks of massively parallel, autonomous, engineered computational systems. Research into the theoretical foundations of Natural computing is needed to improve understanding, on the fundamental level, of computation as the information processing which underlies all computing in nature.

The mutually interrelated objectives of this paper were to:

Offer a computational interpretation of the information dynamics of the info-computational universe;

Suggest the necessity of generalizing models of computation beyond the traditional Turing machine model, and of accepting “second generation” models of computation;

Elucidate a number of frequent misunderstandings concerning models of computation and their relationship to physical computational systems;

Argue for info-computationalism as a new philosophy of nature providing a basis for unification of currently disparate fields of natural, formal and technical sciences;

Answer some of the most prominent criticisms of naturalist computationalism.

The developments supporting info-computational naturalism are expected from, among others, Complexity Theory, Theory of Computation (Organic Computing, Unconventional Computing), Cognitive Science, Neuroscience, Information Physics, Agent Based Models of social systems and Information Sciences, as well as Bioinformatics and Artificial Life [