Open Access Article

A Synergetic Theory of Information

Independent Researcher, 3-82 Papanina St., 620077 Ekaterinburg, Russia
Information 2019, 10(4), 142; https://doi.org/10.3390/info10040142
Received: 25 March 2019 / Revised: 30 March 2019 / Accepted: 11 April 2019 / Published: 16 April 2019
(This article belongs to the Section Information Theory and Methodology)
A new approach to defining the amount of information is presented, in which information is understood as data about a finite set as a whole, and the average length of an integrative code of the elements serves as the measure of information. Within this framework, a formula is obtained, for the first time, for the syntropy of reflection, that is, the information that two intersecting finite sets reflect (reproduce) about each other. Features of the reflection of discrete systems through a set of their parts are considered, and it is shown that the reproducible information about a system (the additive syntropy of reflection) and the non-reproducible information (the entropy of reflection) are measures of structural order and chaos, respectively. On this basis, a general classification of discrete systems by the ratio of order to chaos is given. Three information laws are established: the law of conservation of the sum of chaos and order; the information law of reflection; and the law of conservation and transformation of information. An assessment of the structural organization and the level of development of discrete systems is presented. It is shown that various measures of information are structural characteristics of the integrative codes of the elements of discrete systems. It is concluded that, from an information-genetic standpoint, the synergetic approach to defining the quantity of information is primary relative to the approaches of Hartley and Shannon.
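The law of conservation of the sum of chaos and order mentioned in the abstract can be illustrated numerically. The sketch below assumes, per the paper's framework, that for a system of N elements partitioned into parts of sizes M_i, the additive syntropy (order) is Σ(M_i/N)·log₂M_i and the entropy of reflection (chaos) is the Shannon-form −Σ(M_i/N)·log₂(M_i/N); the function names are illustrative, not from the paper. Under these definitions the two quantities always sum to log₂N, independently of how the system is partitioned.

```python
import math

def additive_syntropy(parts, n):
    """Reproducible information (order): sum of (M_i/N) * log2(M_i)."""
    return sum(m / n * math.log2(m) for m in parts)

def reflection_entropy(parts, n):
    """Non-reproducible information (chaos): -sum of (M_i/N) * log2(M_i/N)."""
    return -sum(m / n * math.log2(m / n) for m in parts)

# A discrete system of N = 12 elements split into parts of sizes 6, 4, 2.
parts = [6, 4, 2]
n = sum(parts)

order = additive_syntropy(parts, n)
chaos = reflection_entropy(parts, n)

# Conservation law: order + chaos = log2(N), regardless of the partition.
print(order + chaos, math.log2(n))
```

Algebraically the identity is immediate: each term contributes (M_i/N)·[log₂M_i − log₂(M_i/N)] = (M_i/N)·log₂N, and the weights M_i/N sum to 1.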
Keywords: syntropy; entropy; chaos; order; amount of information; finite set; integrative code
Figure 1

MDPI and ACS Style

Vyatkin, V. A Synergetic Theory of Information. Information 2019, 10, 142.

