On Symmetries and the Language of Information
Abstract

Many writings on information conflate three notions: information on a given system (IS), the measurable information content of a given system (IM), and the (also measurable) information content that we communicate to one another about a given system (IC). These belong to different levels and different aspects of information. The first (IS) comprises everything that one can, at least potentially, know about a system, but will never learn completely. The second (IM) contains the quantitative data that one actually learns about a system. The third (IC) relates to the language (including mathematics) by which we transmit information about the system to one another, rather than to the system itself.

The information content of a system (IM — this is what we generally mean by information) may include all relevant data on each element of the system. However, we can reduce the quantity of information we need to convey to each other (IC) if we refer to certain symmetry principles or natural laws that the elements of the given system obey. Instead of listing the data for all elements separately, even in a not very extreme case, we can give a short mathematical formula that conveys the data of the individual elements of the system.

This abbreviated form of information delivery rests on several conventions. These conventions are protocols that we have learnt beforehand, and they do not need to be repeated each time within the given community. They include the knowledge that the scientific community accumulated earlier when it discovered and formulated the symmetry principle or the law of nature, the language in which those regularities were formulated and then accepted by the community, and the mathematical symbols and abbreviations that are known only to the members of the given scientific community.
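The compression described above can be sketched in code. A minimal illustration, using Kepler's third law: instead of transmitting each planet's orbital period as a separate datum, the community that shares the law (the convention) only needs the formula plus each planet's semi-major axis. The planet names and approximate values below are assumptions chosen for illustration, not part of the original text.

```python
# Sketch: a natural law compresses per-element data.
# Shared convention: Kepler's third law, T^2 = a^3
# (T in years, a in astronomical units).

# Data still transmitted per element (approximate values, in AU):
semi_major_axis_au = {
    "Mercury": 0.387,
    "Venus": 0.723,
    "Earth": 1.000,
    "Mars": 1.524,
}

def orbital_period_years(a_au: float) -> float:
    """Derive the orbital period from the semi-major axis via T = a^(3/2)."""
    return a_au ** 1.5

# The receiver reconstructs every period without it ever being sent:
for planet, a in semi_major_axis_au.items():
    print(f"{planet}: {orbital_period_years(a):.2f} years")
```

The point is that the per-planet list of periods (part of IM) never appears in the message (IC); it is recoverable only by a receiver who already knows the convention encoded in `orbital_period_years`.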
We do not need to repeat the rules of the convention each time, because the conveyed information presupposes them: they reside in our minds behind the data we communicate about the information content. I demonstrate this with two examples: Kepler's laws, and the law of correspondence between the DNA codons' triplet structure and the individual amino acids they encode. The information content of the language by which we communicate the obtained information cannot be identified with the information content of the system that we want to characterize; moreover, it does not include all the possible information that we could potentially learn about the system. Symmetry principles and natural laws may reduce the information we need to communicate about a system, but we must keep in mind the conventions that we have learnt about the abbreviating mechanism of those principles, laws, and mathematical descriptions.
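The second example, the codon–amino-acid correspondence, can be sketched the same way: the genetic code table plays the role of the previously learnt convention, so a DNA string can be transmitted compactly and the amino-acid sequence reconstructed by any receiver who shares the table. Only a few entries of the standard genetic code are shown; the function name and the excerpt are illustrative assumptions, not from the original text.

```python
# Sketch: the genetic code as a shared convention (protocol).
# Tiny excerpt of the standard codon table, for illustration only:
CODON_TABLE = {
    "ATG": "Met",   # also the start codon
    "TTT": "Phe",
    "GGC": "Gly",
    "TAA": "Stop",
}

def translate(dna: str) -> list[str]:
    """Read the DNA string one triplet (codon) at a time and decode it.

    Codons missing from this excerpt decode to "?".
    """
    return [CODON_TABLE.get(dna[i:i + 3], "?")
            for i in range(0, len(dna) - 2, 3)]

print(translate("ATGTTTGGCTAA"))  # decodes four codons via the shared table
```

As in the Kepler sketch, the amino-acid sequence itself is never transmitted; only the codon string is, and the decoding rule lives in the receiver's prior knowledge.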
Darvas, G. On Symmetries and the Language of Information. Information 2011, 2, 455-459.