1. The Necessity of an Agent (Observer/Actor) for Knowledge Generation
What can we hope for from studies of information in relation to energy/matter (as it appears to us in space/time)? Information is a concept notorious for its ambiguity, both in common, everyday use and in its specific technical applications across different fields of research and technology. Less widely appreciated is that matter/energy is today also a concept surrounded by a disquieting uncertainty. What for Democritus were the building blocks of the whole universe appears today to constitute only 4% of its observed content (NASA 2012) [1]. The rest is labeled “dark matter” (conjectured to explain gravitational effects otherwise unaccounted for) and “dark energy” (introduced to account for the accelerating expansion of the universe). We do not know what “dark matter” and “dark energy” actually are. This alone indicates that our present understanding of the structure of the physical world needs re-examination.
Can we hope, by connecting two much-debated sets of ideas, information and matter/energy, to gain more clarity and some new insight? I believe so! This special issue contains contributions offering beautiful examples of clarity and strength of argument, which help us assemble the jigsaw puzzle of relationships between information and energy/matter.
One of the approaches discussed leads to formulating physics in terms of information. The idea goes back to Wheeler’s “it from bit” [2], developing the view that, for us, information is the fabric of the universe. Our reality results from the processing of information carried by multimodal signals coming through our senses, in combination with information processes in our bodies and brains.
Informational reality is the subject of a steady stream of new books; prominent examples are [3,4,5,6,7,8].
4. It from (Qu)bit
The first topic in focus in this special issue is physics as a science of information, and it is here that the papers of Goyal, Vedral, Fields and Zenil can be placed. Goyal’s paper examines the “it from bit” idea from the point of view of quantum theory: it clarifies why and how information acquires a central role in the move from classical to quantum physics, and how precisely the informational point of view allows us to understand the nature of quantum reality more deeply (both by using quantum theory to implement informational protocols and by formulating information-inspired principles from which the predictions of quantum theory can be derived).
Vedral contributes a discussion of the question of what comes first, physics or information, arguing that physics and the science of information both profit from interaction with each other. Zenil’s article presents a survey of some characteristics of the thermodynamics of computation, relating information, energy, computability and statistical mechanics. He concludes by suggesting research into information and computation applied to biology, a field that until now has successfully resisted the famous “unreasonable effectiveness of mathematics in the natural sciences” proclaimed by Wigner [11]. Fields investigates the consequences of physics understood as information science, elaborating the question of the nature of an observer modeled more realistically than as the Galilean “point of view”. He shows how the observations recorded by an observer that satisfies the requirements of classical automata theory will display the characteristics predicted by quantum theory.
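To make the information/energy link that Zenil surveys concrete, here is a minimal numerical sketch (my own illustration, not taken from his paper) of Landauer’s principle, the standard result that erasing one bit of information dissipates at least kT ln 2 of heat; the physical constant and formula are textbook physics, while the function name is illustrative:

```python
import math

# Landauer's principle: erasing one bit of information dissipates at least
# k_B * T * ln(2) of energy as heat. Shown here only as a numerical
# touchstone for the link between information and energy.
K_B = 1.380649e-23  # Boltzmann constant, J/K (exact, 2019 SI definition)

def landauer_bound(temperature_kelvin: float, bits: float = 1.0) -> float:
    """Minimum heat (in joules) dissipated when erasing `bits` of information."""
    return bits * K_B * temperature_kelvin * math.log(2)

# At room temperature (300 K), erasing one bit costs at least ~2.9e-21 J.
print(f"{landauer_bound(300.0):.3e} J per bit at 300 K")
```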
When studying the physical character of information, it is good to remind oneself that the physical “has a variety of overlapping meanings (a Wittgensteinian family resemblance). For example, Quine takes the physical to be anything accessible to the senses or inferences thereof. Ladyman, Ross, Collier and Spurrett take the physical to be the most fundamental laws of our (part of) the universe. (…) Information is at least physical in both of these senses. Quine's approach might make it entirely physical. I prefer to relate it to the causal, which always has physical parameters, as far as we know. But there are many ways of approaching this issue, and disentangling them will be a major advance in foundations of information theory”, as argued by Collier [12].
Luhn, in this issue, follows this approach of grounding information in physics as the causal, and explains novelty as compositional. His proposed concept of information is developed building on ideas of Burgin, Brenner and Floridi.
On the quantum mechanical level of physical reality there are qubits. As Deutsch puts it [9]: “So, what does that leave us with? Not ‘something for nothing’: Information does not create the world ex nihilo; nor a world whose laws are really just fiction, so that physics is just a form of literary criticism. But a world in which the stuff we call information, and the processes we call computations, really do have a special status.” This emphasis on the dual nature of information/computation is found in Dodig Crnkovic [13] as well as in Zenil [14].
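As a minimal sketch of what the qubit adds to the classical bit (my own toy illustration, not drawn from any of the papers), the following represents a qubit as a unit vector in C² and recovers classical measurement outcomes from the Born rule:

```python
import numpy as np

# A qubit is a unit vector in C^2, here the equal superposition of the
# computational basis states |0> and |1>. Measurement yields a classical
# bit with probabilities given by the Born rule, |amplitude|^2.
ket0 = np.array([1.0, 0.0], dtype=complex)
ket1 = np.array([0.0, 1.0], dtype=complex)
psi = (ket0 + ket1) / np.sqrt(2)   # neither 0 nor 1 until measured

probs = np.abs(psi) ** 2           # Born rule: array([0.5, 0.5])
outcome = np.random.choice([0, 1], p=probs)
print(f"P(0)={probs[0]:.2f}, P(1)={probs[1]:.2f}, measured bit: {outcome}")
```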
5. Self-Organized Complexity, Jaynesian Observers and Logic in Reality
Apart from direct answers to the question of the physical nature of information and the informational nature of physics, the topic of this special issue provoked responses ranging from the role of the observer in self-organized complexity and the mechanisms of chemical affinity relevant to the origins of life, to issues of representation and logic. All of the articles touch upon the role of an observer, or rather an agent interacting with the world, be it a cognizing agent such as a researcher generating theory or a simple physical system such as a chemical molecule interacting with its surroundings.
Matsuno and Salthe’s “observers” (agents) are molecules involved in two types of interactions, used for studying chemical affinity as material agency for naturalizing contextual meaning. An important natural example of this material agency is suggested to be responsible for the origins and evolution of life.
Yoshitake and Saruwatari see the interaction of an agent with the world as an act of measurement. They address the articulation of information from the physical world via interactions alone, generalizing extensive measurement theory.
Brenner’s take on representation in information theory contains a discussion of the limitations of standard theories of representation and argues for a more realistic picture of informational systems, based on information as an energetic process. Brenner proposes an extension of logic to complex real phenomena in his Logic in Reality (LIR), which provides explanations for the evolution of complex processes, including information. Rather than via structural relationships, information in Brenner’s approach is identified by its dynamics, or better, by the fact that its structure is dynamic. This approach can be compared with the Dodig-Crnkovic view of information and computation as two complementary principles [13], where information corresponds to structure while computation stands for dynamics. In that view, Brenner’s information is “unpacked” into its structural and dynamical aspects.
Brenner also contributes to this special issue an informative review of Mark Burgin’s book Theory of Information: Fundamentality, Diversity and Unification.
An information-based, epistemic, inductive probability was developed by the physicist E. T. Jaynes, who used maximum entropy inference (or logic) to convert an observer’s prior information into a prior probability distribution. While Jaynes applied probability theory to the problems of inference facing scientists (notably in statistical mechanics), both Phillips and Fiorillo, in separate articles, consider the implications of Jaynes’s work for our understanding of how the brain can perform inference.
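As a concrete sketch of how prior information becomes a prior distribution, consider Jaynes’s well-known dice example (a standard textbook illustration, not taken from the papers in this issue): knowing only that the mean roll of a die is 4.5, the maximum entropy distribution over the six faces has exponential form, with a Lagrange multiplier fixed by the constraint. A minimal numerical version, with illustrative names:

```python
import numpy as np
from scipy.optimize import brentq

# Jaynes's "Brandeis dice" example: given only that the mean roll is 4.5,
# the least-biased assignment maximizes Shannon entropy subject to that
# constraint, yielding p_i proportional to exp(lam * i).
faces = np.arange(1, 7)
target_mean = 4.5

def mean_under(lam: float) -> float:
    """Mean of the exponential-family distribution with multiplier `lam`."""
    w = np.exp(lam * faces)
    return float((w / w.sum()) @ faces)

# Solve for the Lagrange multiplier that reproduces the observed mean.
lam = brentq(lambda l: mean_under(l) - target_mean, -5.0, 5.0)
p = np.exp(lam * faces)
p /= p.sum()
print("maximum-entropy distribution:", np.round(p, 4))  # weighted toward 5, 6
```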
Phillips’s paper studies how complexity can adaptively self-organize through probabilistic context-sensitive inference. Jaynes’s probabilistic inference is seen not only as the logic of science but, more generally, as the logic of life. Phillips concludes that the theory of Coherent Infomax specifies a general objective for probabilistic inference, and that contextual interactions in neural systems perform functions required of the scientist within Jaynes’s theory.
Fiorillo, too, argues for Jaynes’s approach, in which probabilities are always conditional on an agent’s state of knowledge through the rules of logic, as expressed in the maximum entropy principle, which provides an objective means for deriving probabilities as well as a unified account of information and logic (knowledge and reason). Fiorillo’s article suggests how to identify a probability distribution over states of one physical system (an “object”) conditional only on the biophysical state of another physical system (an “observer”). Even though the aim is to show “what it means to perform inference and how the biophysics of the brain could achieve this goal”, this approach exquisitely exemplifies the idea of relational information as defined by Hewitt [15].
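To illustrate the underlying point that probabilities attach to states of knowledge rather than to the object itself (a toy example of my own, not Fiorillo’s biophysical model), two observers with different knowledge of the same die assign it different, equally valid distributions:

```python
import numpy as np

# Probability as conditional on a state of knowledge: the same die, which
# in fact landed on 6, gets different probability assignments from
# observers whose knowledge differs.
faces = np.arange(1, 7)

def prob(face: int, knows) -> float:
    """P(face | knowledge), by indifference over the admissible faces."""
    admissible = faces[[bool(knows(f)) for f in faces]]
    return 1.0 / len(admissible) if face in admissible else 0.0

# Observer A knows only that it is a six-sided die: P(6 | A) = 1/6.
print(prob(6, lambda f: True))
# Observer B has also learned that the face is even: P(6 | B) = 1/3.
print(prob(6, lambda f: f % 2 == 0))
```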