Proceeding Paper

Discerning Potential and Impact Information †

Department of Mathematics, University of California, Los Angeles, 520 Portola Plaza, Los Angeles, CA 90095, USA
† Presented at the Conference on Theoretical Information Studies, Berkeley, CA, USA, 2–6 June 2019.
Proceedings 2020, 47(1), 4; https://doi.org/10.3390/proceedings2020047004
Published: 13 May 2020
(This article belongs to the Proceedings of IS4SI 2019 Summit)

Abstract

This paper contains further analysis of the concept of information aimed at discovering new features of this mysterious but very important phenomenon. We base our analysis on the general theory of information and contemporary theoretical physics. This approach allows for the explication of two basic complementary types of information—potential and impact information. In such a way, we achieve a better understanding of information as a natural and social phenomenon, which serves as a base for developing novel tools for measuring information.

1. Introduction

In spite of all its importance, information remains a mysterious phenomenon, many properties of which are still waiting to be discovered. Looking at physics, we see that many significant properties of physical systems were discovered by the utilization of physical theories. The same is true for studies of information, where there are different theories of information. The most comprehensive is the general theory of information, which encompasses many known information theories, including, among others, Shannon’s information theory, algorithmic information theory, quantum information theory and economic information theory [1]. The general theory of information illuminates intrinsic relations between information theory and physics, making it possible to use physical theories for the exploration of information. Here, we use this theory for the analysis of information dynamics, starting our exposition with the necessary information about the general theory of information.

2. Elements of the General Theory of Information

The general theory of information (GTI) is an innovative theoretical system with three components: the axiomatic foundation, mathematical core and functional hull [1].
The axiomatic foundation of the general theory of information consists of principles, postulates and axioms, as well as their consequences:
  • Principles describe and explain the essence and main regularities of the information terrain.
  • Postulates are formalized representations of these principles.
  • Axioms describe mathematical structures used in the general theory of information.
The mathematical core of the general theory of information consists of mathematical theories based on the axiomatic foundation of the general theory of information.
The functional hull of the general theory of information contains informal theories based on the axiomatic foundation of the general theory of information, as well as applications of the general theory of information.
There are two classes of principles, the formalized representations of which constitute the postulates of the general theory of information:
  • Ontological principles clarify the essence of information as a natural and artificial phenomenon.
  • Axiological principles describe how to evaluate information and what measures of information are necessary.
There are seven main and several additional ontological principles, which are divided into three groups:
  • Substantial principles [O1, O2 and its modifications O2g, O2a, O2c] define information.
  • Existential principles [O3, O4, O7] describe how information exists in the physical world.
  • Dynamical principles [O5, O6] show how information functions.
In what follows, we use ontological principles O2 and O2g.
Ontological Principle O2 (the General Transformation Principle). In a broad sense, information for a system R is a capacity to cause or prevent changes in the system R.
Thus, we may understand information in a broad sense as a capacity (ability or potency) of things, which can be material, mental or abstract, to change other things or to prevent changes. In such a way, the ontological principle O2 provides a definition of information in a broad sense.
Our experience and scientific studies exhibit that information exists in the form of portions or pieces of information. For instance, we have such portions or pieces of information as information in a word, information in a letter or information in a book.
It is possible to define a portion of information as information contained in one information carrier. Examples of information carriers are physical letters, the memory of a computer, a database or a flash card.
In a similar way, it is possible to define a piece of information as information contained in one information representation. Examples of information representations are texts, signs, symbols or speeches.
There is also the concept of a slice of information, which is a portion of information about some object (domain, system or subject). Examples of information slices are information about the Earth, information about the Moon or information about Sherlock Holmes.
Algorithmic complexity C(x) is a measure of a slice of computational information about object x.
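To make this notion concrete: C(x) itself is not computable, but the length of any lossless compression of x provides an upper bound on it. The following minimal Python sketch (an illustration added here, not a construction from the general theory of information) uses zlib compression as such a practical proxy:

```python
import os
import zlib

def complexity_upper_bound(x: bytes) -> int:
    """Length in bits of a zlib-compressed description of x.

    Algorithmic (Kolmogorov) complexity C(x) is uncomputable; the size of
    any lossless compression of x gives a computable upper bound on it.
    """
    return 8 * len(zlib.compress(x, 9))

# A highly regular string compresses far better than incompressible
# random bytes, reflecting its lower algorithmic complexity.
print(complexity_upper_bound(b"ab" * 500))       # small: strong regularity
print(complexity_upper_bound(os.urandom(1000)))  # stays close to 8000 bits
```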
The ontological principle O2 unifies dynamic aspects of reality because information in a broad sense is projected onto three primal components of reality:
  • Physical reality
  • Mental reality
  • Structural reality
As a result, the ontological principle O2 also amalgamates the following three fundamental conceptions into one comprehensive concept:
  • Information
  • Physical energy
  • Mental energy
Being extremely wide-ranging, this definition supplies meaning to and an explanation of:
  • The conjecture of von Weizsäcker that energy might in the end turn out to be information.
  • The aphorism of Wheeler, “It from Bit”.
  • The statement of Smolin that the three-dimensional energetic world is the flow of information.
The ontological principle O2 unifies three fundamental concepts but does not allow for differentiation between them. Thus, we need one more principle for identifying information in the strict sense.
Ontological Principle O2g (the Relativized Transformation Principle). Information for a system R relative to the infological system IF(R) is a capacity to cause or prevent changes in the system IF(R).
The concept of infological system plays the role of a free parameter in the general theory of information, providing for representation of different kinds and types of information in this theory. This is why the concept of infological system, in general, should not be limited by boundaries of exact definitions. A free parameter must really be free. Identifying an infological system IF(R) of a system R, we can define different kinds and types of information.
To build an efficient model of information, it is expedient to employ the concept of an infological space, the points of which serve as representations of the states of an infological system.

3. Varieties of Information

According to the second ontological principle O2 (the General Transformation Principle) of the general theory of information, information plays the same role in the World of Structures as energy plays in the Physical (material) World [2]. Physicists have studied energy much longer than researchers have explored information. This makes it possible to learn what physicists discovered about energy and to apply this knowledge to the study of information.
In general, physicists treat energy as work capability or the ability to do work (cf., for example, [3]). In this context, work is viewed as physical activity involving physical effort done to produce some organized changes in physical systems. At the same time, in many situations, it is necessary to do work to prevent changes. For instance, an individual receives information that James Clerk Maxwell was a great physicist who was born in 1831 and passed away in 1879. Some people can remember this information without effort. However, the majority needs to perform definite work, for example, repeating these data, writing them down on paper or in a computer, to preserve this information in the memory. Moreover, when time passes, people are prone to forgetting. As a result, the preservation of information demands performing new work [4].
Another example of work done for preservation relates to practically any building. Natural forces, and sometimes people, tend to destroy it. Keeping it the same, i.e., eliminating changes, demands definite restoration work. Besides, we know that the preservation of endangered species in nature also demands some work.
Note that preservation is the prevention of changes in an existing system and it is possible to consider the prevention of changes in this system as a change in the conditional configuration space.
However, energy does not directly cause or prevent changes. To do this, energy has to be transformed (converted) to force and then this force changes a physical system or preserves it from changes.
Analyzing the concept of information with the ontological principles of the general theory of information, we find that, according to the ontological principle O2a (the Special Transformation Principle), information, per se, is the capacity or capability to change (transform) structural systems, such as knowledge systems [5,6]. At the same time, we know that this capacity can be actual, when information induces changes (causes transformation), or potential, when there are no actual changes (transformation). This gives us two types of information: potential information and impact information.
Potential information IP reflects only a possibility of changes (transformations), while impact information, or information force, IF performs changes (transformation).
In physics, energy is measured using the concept of work, and physicists have elaborated exact mathematical descriptions of work. In mechanics, work is defined by a spatial integral of the force acting on an object and is described by the following mathematical formula:
W = ∫ ͞F ⋅ d͞x
In this formula, ͞F denotes the force acting on the object and ͞x denotes the spatial displacement of the object.
When the force ͞F is constant, we have the simpler formula:
W = ͞F ⋅ ͞S
where ͞S is the displacement.
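As a simple numerical illustration of this definition (added here, not taken from the paper), the following Python sketch approximates the work integral for a one-dimensional spring force F(x) = −kx; the spring constant and displacement range are arbitrary illustrative values:

```python
import numpy as np

# Work as the spatial integral of force, W = ∫ F(x) dx, approximated
# with the trapezoidal rule for the 1-D spring force F(x) = -k*x.
k = 2.0                            # spring constant, N/m (illustrative)
x = np.linspace(0.0, 0.5, 1001)    # displacement samples, m
F = -k * x                         # force at each sample, N

# Trapezoidal rule: mean force on each segment times segment length.
W = float(np.sum((F[1:] + F[:-1]) / 2 * np.diff(x)))
print(W)  # ≈ -k * 0.5**2 / 2 = -0.25 J
```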
In this context, it is possible to define information work WI (which is also called structural work) as the size (amount) of structural changes (transformation) times the measure (amount) of impact information.
In physics, forces and infinitesimal displacements have direction and thus are modeled by vectors. In the case of information, if the infological space of action is modeled by a vector space or a topological manifold, impact information and infological displacement also become vectors.
Let us apply this concept to epistemic information, which changes the knowledge of a system, because information theory elaborated definite measures of knowledge and knowledge transformations. We will use here the epistemic measure studied in [5,7].
To measure epistemic displacement, we describe an epistemic space as a network of finite sets (systems) of knowledge units, defining the distance between two units as equal to 1 as the first approximation to the epistemic metric. For instance, if knowledge is represented by logical formulas of the propositional calculus, we take elementary propositions as units of knowledge. It is also possible to take all propositions as units of knowledge.
At first, let us figure out how to measure displacements in an epistemic space. The transformation of knowledge systems is performed by two elementary operations: the addition of a knowledge unit and the elimination of a knowledge unit. Each such operation determines a displacement of length equal to 1. Thus, when a knowledge system R is transformed into a knowledge system P, the displacement is equal to the number of elementary operations performed in this transformation. We call this process P a knowledge unit exchange and measure it by the number of performed elementary operations N(P), as the sketch below illustrates.
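If knowledge systems are represented as finite sets of knowledge units, the minimal displacement N(P) is simply the size of the symmetric difference of the two sets. A minimal Python sketch, under this set-based assumption:

```python
def knowledge_displacement(R: set, P: set) -> int:
    """Minimal number of elementary operations (additions and
    eliminations of knowledge units) transforming knowledge system R
    into knowledge system P.

    Every unit in P but not in R must be added, and every unit in R but
    not in P must be eliminated, so the displacement is the size of the
    symmetric difference of the two sets.
    """
    return len(R ^ P)

# Example with elementary propositions as knowledge units.
R = {"p", "q", "p -> q"}
P = {"p", "q", "q -> p", "r"}
print(knowledge_displacement(R, P))  # 3: eliminate "p -> q", add "q -> p" and "r"
```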
Assuming that the transformation (change) is performed under the action of impact information (information force), we have to measure this force. In the case of epistemic information, it is natural to suppose that information comes to the system in the form of data. Then, it is possible to measure the force by the size of input data, e.g., in bits or in bytes.
This gives us the following mathematical formula for information work:
WI = m(IF)⋅D
where m(IF) is the measure of the impact information and D is the structural displacement.
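A hedged Python sketch of this formula, assuming (as above) that the impact information is measured by the size of the input data in bits and that D counts elementary knowledge operations:

```python
def information_work(input_data: bytes, displacement: int) -> int:
    """Information work WI = m(IF) * D, where the measure m(IF) of the
    impact information is taken to be the size of the input data in bits
    and D is the structural displacement in elementary operations.
    """
    m_IF = 8 * len(input_data)  # measure of the information force, bits
    return m_IF * displacement

# Suppose a 12-byte message causes 3 elementary knowledge operations.
print(information_work(b"Maxwell 1831", 3))  # 96 bits * 3 = 288
```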
Examples of other measures of epistemic information are Shannon’s entropy [8], the Hartley measure [9], the Rényi entropy [10], Fisher information [11] and some others. It would be interesting to build measures for information work using these measures; the sketch below computes three of them.
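For reference, the following Python sketch computes three of these classical measures for a finite probability distribution; it is a standard textbook rendering rather than a construction specific to this paper:

```python
import math

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum p_i log2 p_i, in bits [8]."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def hartley_measure(n):
    """Hartley measure log2 n for n equally possible alternatives [9]."""
    return math.log2(n)

def renyi_entropy(p, alpha):
    """Rényi entropy of order alpha (alpha > 0, alpha != 1) [10]."""
    return math.log2(sum(pi ** alpha for pi in p)) / (1 - alpha)

p = [0.5, 0.25, 0.25]
print(shannon_entropy(p))     # 1.5 bits
print(hartley_measure(4))     # 2.0 bits
print(renyi_entropy(p, 2.0))  # ≈ 1.415 bits
```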
After defining information force, a crucial question is how to measure it. Let us consider this problem in the context of such a mental infological system as a thesaurus or system of knowledge. In this case, there are several measures to estimate changes of knowledge—cognitive displacement. As we already discussed, it is possible to use Shannon’s entropy or knowledge unit exchange for this purpose.
Let us consider two processes—learning some material and decoding a coded text—to measure information force. In these cases, it is possible to suppose that information force IF is proportional to the mental effort Eff in this process, i.e., m(IF) = k·m(Eff). Thus, to find the measure of the information force, we need to determine the coefficient of proportionality k and to measure the mental effort in these processes.
The mental effort of an individual who learns some portion of knowledge or decodes a coded text can be measured by the time of this learning or decoding. It is also possible to determine mental effort by measuring the intensity of brain activity during learning or in the process of decoding.
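Under these assumptions, a measurement of information force reduces to one multiplication. A minimal Python sketch, in which the proportionality coefficient k is a placeholder that would have to be calibrated empirically:

```python
def information_force(effort_seconds: float, k: float = 1.0) -> float:
    """Measure of information force, m(IF) = k * m(Eff), with the mental
    effort Eff measured by the time spent learning or decoding.

    k = 1.0 is a placeholder assumption; it would have to be calibrated
    empirically for a given class of learners and tasks.
    """
    return k * effort_seconds

# Effort of learning one portion of knowledge, measured as 10 minutes
# of learning time.
print(information_force(600.0))  # with k = 1.0: m(IF) = 600.0
```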
In general, and in these processes in particular, it is essential to discern instantaneous information force and average information force, as well as instantaneous mental effort and average mental effort.

4. Conclusions

Two basic complementary types of information—potential and impact information—are described and studied. Adequate tools for measuring these two types of information are discussed. The next step in this direction is the elaboration of a mathematical theory of potential and impact information, in which the latter can be modeled by action vectors or operators in an infological space.

Funding

This research received no external funding.

Conflicts of Interest

The author declares no conflict of interest.

References

  1. Burgin, M. Theory of Information: Fundamentality, Diversity and Unification; World Scientific: New York, NY, USA; London, UK; Singapore, 2010.
  2. Burgin, M. Information in the Structure of the World. Inf. Theor. Appl. 2011, 18, 16–32.
  3. Shope, R.K. Physical and Psychic Energy. Philos. Sci. 1971, 38, 1–12.
  4. Anderson, J.R.; Bower, G.H. Human Associative Memory; V.H. Winston and Sons: Washington, DC, USA, 1973.
  5. Burgin, M. Weighted E-Spaces and Epistemic Information Operators. Information 2014, 5, 357–388.
  6. Burgin, M. The General Theory of Information as a Unifying Factor for Information Studies: The Noble Eight-Fold Path. Proceedings 2017, 1, 164.
  7. Mizzaro, S. Towards a theory of epistemic information. In Information Modelling and Knowledge Bases 12; IOS Press: Amsterdam, The Netherlands, 2001; pp. 1–20.
  8. Shannon, C.E. The Mathematical Theory of Communication. Bell Syst. Tech. J. 1948, 27, 379–423.
  9. Hartley, R.V. Transmission of information. Bell Syst. Tech. J. 1928, 7, 335–363.
  10. Rényi, A. On measures of entropy and information. In Proceedings of the Fourth Berkeley Symposium on Mathematical Statistics and Probability, Volume 1: Contributions to the Theory of Statistics; The Regents of the University of California: Berkeley, CA, USA, 1961; pp. 547–561.
  11. Fisher, R.A. Statistical Methods and Scientific Inference; Oliver and Boyd: Edinburgh, UK, 1956.
