# Structural and Symbolic Information in the Context of the General Theory of Information

## Abstract


## 1. Introduction

## 2. Basic Postulates of the General Theory of Information (GTI)

**Ontological Principle O1 (the Locality Principle).** It is necessary to separate information, in general, from information (or a portion of information) for a system R.

**Ontological Principle O2 (the General Transformation Principle).** In a broad sense, information for a system R is a capacity to cause changes in the system R.

**Ontological Principle O2g (the Relativized Transformation Principle).** Information for a system R relative to the infological system IF(R) is a capacity to cause changes in the system IF(R).

**Ontological Principle O2a (the Special Transformation Principle).** Information in the strict sense, or proper information, or, simply, information for a system R is a capacity to change structural infological elements from an infological system IF(R) of the system R.

**Ontological Principle O2c (the Cognitive Transformation Principle).** Cognitive information for a system R is a capacity to cause changes in the cognitive infological system IFC(R) of the system R.

**Ontological Principle O3 (the Embodiment Principle).** For any portion of information I, there is always a carrier C of this portion of information for a system R.

**Ontological Principle O4 (the Representability Principle).** For any portion of information I, there is always a representation Q of this portion of information for a system R.

**Ontological Principle O5 (the Interaction Principle).** A transaction/transition/transmission of information goes on only in some interaction of the carrier C with the system R.

**Ontological Principle O6 (the Actuality Principle).** A system R accepts a portion of information I only if the transaction/transition/transmission causes corresponding transformations in R or in its infological system IF(R).

**Ontological Principle O7 (the Multiplicity Principle).** One and the same carrier C can contain different portions of information for one and the same system R.

## 3. Structural Information in the Context of the General Theory of Information

- Inherent structural information is information in structures.
- Descriptive structural information is information about structures.
- Constructive structural information is information that allows building knowledge about structures.

1. Structural information can be more correct or less correct.

**Example 1.**

**Example 2.**

2. As a rule, structural information about a system is not unique.
3. Structural information about a system can be inherent to this system, built into the interaction with the system, or innate for an image of the system.
4. Processes in a system can change structural information about this system.
5. Structural information about a system describes this system to a definite extent of precision, i.e., structural information can be more precise or less precise.
6. For complex systems, it is possible to consider structural information on different levels and at various scales.
7. Structural information about a subsystem of a system is not always a part of the structural information about this system.
8. The process of conversion of structural information about a system into knowledge about this system is, in essence, structuration of this system.

## 4. Structural Information as an Intrinsic Property

## 5. Symbolic Information

**Definition 1.**

- (i) Discrete symbols are robust against small perturbations, i.e., symbols may be replaced by similar imitations. In simple information-processing systems, the receiver may be a dynamical system to which an incoming symbol appears as an imposed boundary or initial condition. The system will then approach an associated attractor state, which physically represents the meaning of the symbol. Often, the attraction basin, i.e., the set of conditions leading to the same attractor, is a compact set, and slightly modified symbols within that basin will cause the same attractor to be reached. As an example, written letters are recognized as equal even if they are displayed in different fonts, sizes, or colors. However, irregular handwriting, distortion, or damage can essentially change the interpretation of symbols by the receiver.
- (ii) Reading a symbol can refresh it, permitting largely lossless copies to be produced if the refreshment happens within the physical lifetime of the symbol(s). Multiplying cells and organisms, but also computer memories, implement this refreshment technique for safe long-term data storage.
- (iii) Robustness against small symbol perturbations permits dialects to evolve which increasingly use modified symbols that are similar in the sense that, upon reading, they produce the same results as their originals. In turn, this process permits gradual deformation of the attraction basin, or even the spawning of new basins, that is, drift and diversification of symbols; “If signals are under selection for efficient transmission between signaler and receiver, then populations inhabiting environments that differ in their effect on signal transmission or detection are expected to evolve different signals—a process known as ‘sensory drive’” [44].
- (iv) Symbolic information is conventional. A system of symbols may be replaced by a completely different set of symbols if this transformation is simultaneously applied to the message, the transmitter, and the receiver. On a Chinese tablet computer, Chinese characters, their Latin transcription, and the related binary machine code are continually exchanged for one another while the tablet is used. Genetic DNA or RNA bases, together with their complementary strands, represent the same information. Symbolic information is invariant against such arbitrary symbol transformations (substitutions).
- (v) The replacement of a symbol by a physically different one, either with the same or with a different meaning, is energetically practically neutral. Symbols are “energy-degenerate” [14]. Any forces driving a modified message back to some fictitious distinguished “equilibrium message” are virtually absent. Physically formulated, so-called Goldstone modes with vanishing Lyapunov exponents appear along with the emergence of symbols (a process termed the ritualization transition [7,8]) and permit exceptionally large fluctuations. Thermodynamically, particular messages appear as alternative “microstates” that populate a “Boltzmann shell” of an information-processing system; “In principle, a sequence of symbols—such as the text of a given novel—represents a microstate” [16]. In fact, the famous Boltzmann formula for the thermal entropy, S = k log W, of an equilibrium system with W microstates equals Shannon’s formula for the information capacity if converted to the unit “bit” [5].
- (vi) As a result of the coincidence of structural and symbolic information immediately at the ritualization transition, the structural information of the symbols in the Goldstone modes keeps a trace of the evolution history of the symbolic information system, until this trace is gradually eroded by fluctuations and neutral drift. The physical form of symbols expresses and reveals their historicity.
- (vii) Looking at symbols from the symbolic side, the code symmetry prevents conclusions from being drawn from the meaning of the information about the physical properties of the symbols. Running a computer program does not permit deciding whether the memory bits are stored by, say, charging electrical capacitors or swapping magnetic fields. Introspection of our mind while thinking does not offer any clues on which transmitter substances may be released between synapses, or on the nature of nerve-pulse propagation; “The faculty with which we ponder the world has no ability to peer inside itself or our other faculties to see what makes them tick” ([45] p. 4).
- (viii) Looking at symbols from the structural side, the code symmetry prevents conclusions from being drawn from the structure of the symbols about the meaning of the symbolic message. “Information and computation reside in patterns of data and in relations of logic that are independent of the physical medium that carries them” ([45] p. 24). This means that symbolic information is an emergent property. The same message may be expressed by different symbols, tokens, or languages; a sequence of symbols may be reversibly compressed or redundantly inflated without affecting the meaning of the message. In order to produce a cup of coffee, a single on/off bit may be sent to a coffee machine, or a long instruction may be given to an inexperienced cook to prepare an equivalent result. The same mathematical problem may be solved by very different program codes whose mutual equivalence remains elusive without knowledge about the receiver, namely the rules for how to compile and execute the code and convert the message back into structural information. This position differs from opinions like that of the sharp thinker Pearson ([46] p. 50) that “we may say … without any dogmatic assumption that psychical effects can all be reduced to physical motion”.
- (ix) Added redundancy, such as partial repetition or embedded grammatical rules combined with orthographic vocabularies, leaves the meaning of symbolic information immediately unaffected but allows additional information-protection tools to evolve for error detection and correction of random perturbations. During later stages after the ritualization, such tools partially counteract the neutral drift of symbols and constrict the set of available Goldstone modes. About half of written English text represents syntactic redundancy [47].
- (x) Information processing of discrete symbols is performed by digital computers of any physical kind. Although Turing asserted, “This special property of digital computers, that they can mimic any discrete state machine, is described by saying that they are universal machines” ([48] p. 441), this is not true because digital computers cannot simulate (mimic) discrete state machines with an infinite number of states. Some think that computational universality suggests the possibility of simulating the human brain on an electronic computer [49]. However, this is also incorrect because nobody has proved that the human brain is a discrete state machine. Moreover, it has been demonstrated that even universal Turing machines are not universal in the realm of algorithms and automata because there are super-recursive algorithms, which are more powerful than any Turing machine [50,51].
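The unit conversion claimed in point (v), namely that Boltzmann’s S = k log W agrees with Shannon’s capacity formula when expressed in bits, can be checked numerically. The following is a minimal sketch (the helper names are ours, and W = 2^n models a carrier of n binary symbols):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K (exact SI value)

def boltzmann_entropy(w: int) -> float:
    """Thermal entropy S = k ln(W) of an equilibrium system with W microstates."""
    return K_B * math.log(w)

def shannon_capacity_bits(w: int) -> float:
    """Shannon information capacity, in bits, of W equally likely messages."""
    return math.log2(w)

# A message of n binary symbols has W = 2**n possible microstates.
n = 8
W = 2 ** n
assert shannon_capacity_bits(W) == n
# One bit of capacity corresponds to k*ln(2) of thermal entropy, so the two
# formulas agree up to this constant unit factor.
assert abs(boltzmann_entropy(W) - n * K_B * math.log(2)) < 1e-35
```

In this sense, a Shannon bit and a thermal entropy of k ln 2 are the same quantity measured in different units.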

- (i) Symbolic information systems possess a kind of partial symmetry, carrier invariance. In many situations, it is possible to copy information without loss to other carriers or to multiply it in the form of an unlimited number of physical instances. The information content seems independent of the physical carrier used. However, many linguists doubt the possibility of exact translation of a text from one natural language to another (cf., for example, [52,53]). This problem is also apparent in computing, as the absence of automatic translators between high-level programming languages, such as Fortran, Pascal, or C, demonstrates.
- (ii) Symbolic information systems possess a new symmetry, coding invariance. The functionality of the processing system is unaffected by the substitution of symbols by other symbols as long as unambiguous bidirectional conversion remains possible. In particular, the stock of symbols can be extended by the addition of new symbols or the differentiation of existing symbols. At higher functional levels, coding invariance applies similarly to the substitution of groups of symbols, synonymous words, or equivalent languages.
- (iii) Within the physical relaxation time of the carrier structure, discrete symbols represent quanta of information that do not degrade and can be refreshed unlimitedly.
- (iv) Imperfect functioning or external interference may destroy symbolic information, but only life-based processing systems can generate new or recover lost information.
- (v) Symbolic information systems consist of complementary physical components that are capable of producing the structures of each of the symbols in an arbitrary sequence upon writing, of keeping the structures intact over the duration of transmission or storage, and of detecting each of those structures upon reading the message. If the stock of symbols is subject to evolutionary change, a consistent co-evolution of all components is required.
- (vi) Symbolic information is an emergent property; its governing laws are beyond the framework of physics, even though the supporting structures and processes do not violate physical laws.
- (vii) Symbolic information often has a meaning or purpose beyond the scope of physics.
- (viii) In their structural information, the constituents of the symbolic information system preserve a frozen history (“fossils”) of their evolution pathway.
- (ix) Symbolic information processing is an irreversible, non-equilibrium process that produces entropy and requires a free-energy supply.
- (x) Symbolic information is encoded in the form of structural information of its carrier system. Source, transmitter, and destination represent and transform physical structures.
- (xi) Symbolic information exists only in the context of life, although this life can be natural or artificial.
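The coding invariance of property (ii) can be illustrated with a toy substitution code. The two-bit code words below are an arbitrary choice; any unambiguously invertible (bijective) substitution would serve equally well:

```python
# A bijective substitution of symbols: each source symbol gets a fixed-width
# code word, so the mapping is unambiguously invertible in both directions.
code = {"A": "01", "B": "10", "C": "11"}
decode = {v: k for k, v in code.items()}  # inverse exists because code is bijective

def transmit(message: str) -> str:
    """Substitute every symbol by its code word."""
    return "".join(code[s] for s in message)

def receive(signal: str) -> str:
    """Apply the inverse substitution; fixed width keeps parsing unambiguous."""
    return "".join(decode[signal[i:i + 2]] for i in range(0, len(signal), 2))

message = "ABCAB"
assert receive(transmit(message)) == message  # the message survives the substitution
```

The functionality of the transmitter–receiver pair is unchanged as long as both sides apply the substitution consistently; extending `code` with new symbols preserves this property provided the mapping stays bijective.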

**Definition 2.**

## 6. Relations and Interactions between Symbolic Information and Structural Information

- (i) Measure-related conservation of information is realized if, in a process of conversion of a portion of information I to a different form I’, a certain measure of information, e.g., S, remains the same, i.e., S(I) = S(I’). For instance, if I is the structural information of a macroscopic system at some time t, and I’ is the structural information of that system at some later time t’, and entropy S is the chosen measure of information, then information is conserved with respect to S in reversible processes. Similarly, it is possible to understand conservation of mass, spin, and electric and magnetic charge as the conservation of structural information [63].
- (ii) Transformation-related conservation of information is realized if, in a process of conversion of the form F of a portion of information I to a different form F’, both F and F’ have the capacity to cause the same changes in a system R. According to the general theory of information, both F and F’ represent the same information. For instance, if I is the symbolic information in a textbook F written in a language L and F’ is a contextual translation of the textbook F into another language, then for a student R the information is conserved in the transformation-related sense if the translation is correct and R knows both languages to the same extent. The translation may be formally written as the operation T: F → F’. It is possible to interpret conservation as the existence of the inverse operator T^{−1}, a backward translation, such that F = T^{−1}(F’), so that all translations T form a mathematical group. Similarly, structural information in quantum mechanics is subject to conservation in processes governed by a group of unitary evolution operators [28].
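The inverse operator T^{−1} in item (ii) can be sketched concretely. The word-for-word dictionary below is a deliberately simplified, hypothetical stand-in for a translation (real contextual translation is not a mere table lookup):

```python
# Transformation-related conservation: a translation T with an inverse T_inv
# such that T_inv(T(F)) = F for every message F over the vocabulary.
T = {"sun": "Sonne", "moon": "Mond", "star": "Stern"}  # hypothetical English-German table
T_inv = {v: k for k, v in T.items()}                   # the backward translation T^{-1}

def translate(text: list[str], table: dict[str, str]) -> list[str]:
    """Apply a word-for-word translation table to a message."""
    return [table[word] for word in text]

F = ["sun", "moon", "star"]
F_prime = translate(F, T)                # F' = T(F)
assert translate(F_prime, T_inv) == F    # F = T^{-1}(F'): information is conserved
```

Because every such invertible table composes with others and has an inverse, the set of these translations forms a group, mirroring the group of unitary evolution operators mentioned above for quantum mechanics.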

## 7. Conclusions

## Author Contributions

## Conflicts of Interest

## References

1. Burgin, M. Structural Reality; Nova Science Publishers: New York, NY, USA, 2012.
2. Ladyman, J. Structural Realism, Stanford Encyclopedia of Philosophy, 2009. Available online: http://plato.stanford.edu/entries/structural-realism/ (accessed on 1 November 2017).
3. Worrall, J. Structural Realism: The Best of Both Worlds? Dialectica 1989, 43, 99–124.
4. Ebeling, W.; Feistel, R. Selforganization of Symbols and Information. In Chaos, Information Processing and Paradoxical Games: The Legacy of John S Nicolis; Nicolis, G., Basios, V., Eds.; World Scientific Pub Co.: Singapore, 2015; pp. 141–184.
5. Feistel, R. Self-organisation of Symbolic Information. Eur. Phys. J. Spec. Top. 2017, 226, 207–228.
6. Feistel, R. Emergence of Symbolic Information by the Ritualisation Transition. In Information Studies and the Quest for Transdisciplinarity; Burgin, M., Hofkirchner, W., Eds.; World Scientific: Singapore, 2017; pp. 115–164.
7. Feistel, R.; Ebeling, W. Physics of Self-Organization and Evolution; Wiley-VCH: Weinheim, Germany, 2011; p. 517.
8. Feistel, R.; Ebeling, W. Entropy and the Self-Organization of Information and Value. Entropy 2016, 18, 193.
9. Ebeling, W.; Feistel, R. Chaos und Kosmos: Prinzipien der Evolution; Spektrum: Heidelberg, Germany, 1994. (In German)
10. Brillouin, L. Science and Information Theory; Academic Press: New York, NY, USA, 1956.
11. Klimontovich, Y.L. Turbulent Motion, the Structure of Chaos; Kluwer Academic Publishers: Dordrecht, The Netherlands, 1991.
12. Schrödinger, E. What Is Life—The Physical Aspect of the Living Cell; Cambridge University Press: Cambridge, UK, 1944.
13. Lane, N. The Vital Question: Why Is Life the Way It Is? Profile Books: London, UK, 2015.
14. Pattee, H.H. The Physics of Symbols: Bridging the Epistemic Cut. Biosystems 2001, 60, 5–21.
15. Pearson, K. The Grammar of Science, 2nd ed.; Adam and Charles Black: London, UK, 1900.
16. Eigen, M. From Strange Simplicity to Complex Familiarity; Oxford University Press: Oxford, UK, 2013.
17. Boyajian, T.S.; LaCourse, D.M.; Rappaport, S.A.; Fabrycky, D.; Fischer, D.A.; Gandolfi, D.; Kennedy, G.M.; Korhonen, H.; Liu, M.C.; Olah, A.M.K.; et al. Planet Hunters IX, KIC 8462852: Where’s the flux? Mon. Not. R. Astron. Soc. 2016, 457, 3988–4004.
18. Burgin, M. Theory of Information: Fundamentality, Diversity and Unification; World Scientific: New York, NY, USA, 2010.
19. Burgin, M. Information in the Structure of the World. Inf. Theor. Appl. 2011, 18, 16–32.
20. Burgin, M. Epistemic Information in Stratified M-Spaces. Information 2011, 2, 697–726.
21. Burgin, M. Weighted E-Spaces and Epistemic Information Operators. Information 2014, 5, 357–388.
22. Burgin, M. Nonlinear phenomena in spaces of algorithms. Int. J. Comput. Math. 2003, 80, 1449–1476.
23. Burgin, M. Data, Information, and Knowledge. Information 2004, 7, 47–57.
24. Capurro, R.; Fleissner, P.; Hofkirchner, W. Is a Unified Theory of Information Feasible? In The Quest for a Unified Theory of Information: Proceedings of the Second International Conference on the Foundations of Information Science; Hofkirchner, W., Ed.; Gordon and Breach Publ.: Amsterdam, The Netherlands, 1999; pp. 9–30.
25. Melik-Gaikazyan, I.V. Information Processes and Reality; Nauka: Moscow, Russia, 1997.
26. Bates, M.J. Information and knowledge: An evolutionary framework for information science. Inf. Res. 2005, 10, 10–14.
27. Reading, A. The Biological Nature of Meaningful Information. Biol. Theory 2006, 1, 243–249.
28. Giddings, S.B. Black holes, quantum information, and the foundations of physics. Phys. Today 2013, 66, 30.
29. Hawking, S. The Universe in a Nutshell; Bantam Books: New York, NY, USA, 2001.
30. Zhang, B.; Cai, Q.-Y.; Zhan, M.-S.; You, L. Information conservation is fundamental: Recovering the lost information in Hawking radiation. Int. J. Mod. Phys. D 2013, 22, 1341014.
31. Planck, M. Theorie der Wärmestrahlung; Johann Ambrosius Barth: Leipzig, Germany, 1966. (In German)
32. Gilbert, W. Origin of life: The RNA world. Nature 1986, 319, 618.
33. Fuchs-Kittowski, K. Information und Biologie. Informationsentstehung als neue Kategorie für eine Theorie der Biologie. In Biochemie ein Katalysator der Biowissenschaften. Kolloquium der Leibniz-Sozietät am 20. Nov. 1997 zum 85. Geburtstag von Samuel Mitja Rapoport; Sitzungsberichte der Leibniz-Sozietät, Band 22; trafo-Verlag: Berlin, Germany, 1999. (In German)
34. Monod, J. Chance and Necessity; Alfred A. Knopf: New York, NY, USA, 1971.
35. Borges, J.L. The Library of Babel; Penguin: New York, NY, USA, 1998.
36. Feistel, R. Ritualisation und die Selbstorganisation der Information. In Selbstorganisation—Jahrbuch für Komplexität in der Natur-, Sozial- und Geisteswissenschaften; Duncker & Humblot: Berlin, Germany, 1990; Volume 1, pp. 83–98. (In German)
37. Ebeling, W. Ist Evolution vom Einfachen zum Komplexen gerichtet? Über Werte und Emergenz. Sitzungsberichte der Leibniz-Sozietät der Wissenschaften zu Berlin 2016, 125, 69–80. (In German)
38. Burgin, M.; Meissner, G. 1 + 1 = 3: Synergy Arithmetic in Economics. Appl. Math. 2017, 8, 133–144.
39. Lawrence, C. Making 1 + 1 = 3: Improving sedation through drug synergy. Gastrointest. Endosc. 2011, 73, 215–217.
40. Trott, D. One Plus One Equals Three: A Masterclass in Creative Thinking; Macmillan: London, UK, 2015.
41. Butterfield, J. Laws, causation and dynamics at different levels. Interface Focus 2012, 2, 101–114.
42. Fuentes, M.A. Complexity and the Emergence of Physical Properties. Entropy 2014, 16, 4489–4496.
43. Born, M. Physics in My Generation. In Symbol and Reality; Springer: New York, NY, USA, 1969; Chapter 13; p. 132.
44. Endler, J.A. Some general comments on the evolution and design of animal communication systems. Philos. Trans. Biol. Sci. 1993, 340, 215–255.
45. Pinker, S. How the Mind Works; Penguin Books: London, UK, 1997.
46. Pearson, K. The Grammar of Science, 3rd ed.; Cosimo: New York, NY, USA, 2007.
47. Shannon, C.E. The Mathematical Theory of Communication. Bell Syst. Tech. J. 1948, 27, 379–423, 623–656.
48. Turing, A.M. Computing machinery and intelligence. Mind 1950, 59, 433–460.
49. Gallistel, C.R. The Organization of Learning; The MIT Press: Cambridge, MA, USA, 1993.
50. Burgin, M. Information Theory: A Multifaceted Model of Information. Entropy 2003, 5, 146–160.
51. Burgin, M. Superrecursive Algorithms; Springer: New York, NY, USA, 2005.
52. Catford, J.A. Linguistic Theory of Translation; Oxford University Press: Oxford, UK, 1965.
53. Weisgerber, L. Grundzüge der Inhaltsbezogenen Grammatik; Pädagogischer Verlag Schwann: Düsseldorf, Germany, 1971. (In German)
54. Burgin, M.; Schumann, J. Three Levels of the Symbolosphere. Semiotica 2006, 160, 185–202.
55. Logan, R.K.; Schumann, J.H. The Symbolosphere, Conceptualization, Language and Neo-Dualism. Semiotica 2005, 155, 201–214.
56. Schumann, J.H. The evolution of language: What evolved? In Proceedings of the Society for Pidgin and Creole Linguistics Summer Conference, University of Hawaii, Honolulu, HI, USA, 14–17 August 2003.
57. Prum, R.O. The Evolution of Beauty: How Darwin’s Forgotten Theory of Mate Choice Shapes the Animal World—And Us; Doubleday: New York, NY, USA, 2017.
58. Burgin, M. Theory of Knowledge: Structures and Processes; World Scientific: New York, NY, USA, 2016.
59. Burgin, M. Ideas of Plato in the context of contemporary science and mathematics. Athens J. Humanit. Arts 2017, 4, 161–182.
60. Wolpert, D.H.; Macready, W.G. Coevolutionary free lunches. IEEE Trans. Evol. Comput. 2005, 9, 721–735.
61. Leeson, P.T. Oracles. Ration. Soc. 2014, 26, 141–169.
62. Burgin, M. Inaccessible Information and the Mathematical Theory of Oracles. In Information Studies and the Quest for Transdisciplinarity: Unity through Diversity; World Scientific: New York, NY, USA, 2017; pp. 59–114.
63. Pappas, N.D. A New Approach to Information Loss (no) Problem for Black Holes. Int. J. Theor. Math. Phys. 2012, 2, 5–9.

© 2017 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).

## Share and Cite

**MDPI and ACS Style**

Burgin, M.; Feistel, R.
Structural and Symbolic Information in the Context of the General Theory of Information. *Information* **2017**, *8*, 139.
https://doi.org/10.3390/info8040139
