# Classifying Entropy Measures

## Abstract


## 1. Introduction

The Shannon entropy [1] of a discrete probability distribution is defined as

H = −∑_i p_i log p_i = ∑_i p_i log (1/p_i)

In general, H_n will be a function of n non-negative random variables that add up to 1, and represent probabilities. H_n acts on the n-tuple of values on the sample, (p_i)_{i=1,2,...,n}.
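As a quick numerical check, the two equivalent forms of the definition can be compared directly (a minimal sketch using NumPy; the helper name `shannon_entropy` is ours, not the paper's):

```python
import numpy as np

def shannon_entropy(p):
    """H = -sum_i p_i log p_i, with the convention 0 log 0 = 0."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log(p))

p = np.array([0.5, 0.25, 0.125, 0.125])
h_neg_sum = -np.sum(p * np.log(p))       # -sum p_i log p_i
h_inverse = np.sum(p * np.log(1.0 / p))  # sum p_i log(1/p_i)
```

Both forms agree because log(1/p) = −log p term by term.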

Among its basic properties, entropy is maximal for the uniform distribution,

H_n(p_1, p_2, …, p_n) ≤ H_n(1/n, 1/n, …, 1/n)

and it increases strictly with the number of equiprobable outcomes,

H_n(1/n, 1/n, …, 1/n) < H_{n+1}(1/(n + 1), 1/(n + 1), …, 1/(n + 1))

Moreover, entropy is additive over independent subsystems: if a system S decomposes into mutually independent subsystems, S = ∪_{i=1,2,...,n} S_i, then H(S) = ∑_{i=1,2,...,n} H(S_i).
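The three properties above can be verified numerically (a minimal sketch; the Dirichlet sampler is just a convenient way to draw a random distribution):

```python
import numpy as np

def H(p):
    """Shannon entropy (natural log), with the 0 log 0 = 0 convention."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log(p))

rng = np.random.default_rng(0)
n = 5
p = rng.dirichlet(np.ones(n))      # a random distribution on n outcomes
uniform = np.full(n, 1.0 / n)      # the uniform distribution on n outcomes

# Additivity: independent subsystems combine via the product distribution
q = rng.dirichlet(np.ones(3))
joint = np.outer(p, q).ravel()     # joint law of two independent systems
```

`H(p) <= H(uniform)`, `H(uniform)` grows with n, and `H(joint) = H(p) + H(q)` hold up to floating-point error.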

Any reasonable information measure I should satisfy the following axioms:

- (1) I(P) ≥ 0, i.e., information is a non-negative quantity;
- (2) I(1) = 0, i.e., if an event has probability 1, we get no information from the occurrence of the event;
- (3) if two independent events occur, the information we get from observing both is the sum of the two informations;
- (4) the information measure must be continuous, and a monotonic function of the probability, so that slight changes in probability result in slight changes in information.
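These four axioms are satisfied by the self-information I(p) = −log p; a minimal check (base-2 logarithms are our choice here, not mandated by the text):

```python
import math

def info(p):
    """Self-information I(p) = -log2(p), in bits."""
    return -math.log2(p)

# (2) certain events carry no information
zero_info = info(1.0)
# (3) for independent events, probabilities multiply and informations add
additive_gap = info(0.5 * 0.25) - (info(0.5) + info(0.25))
```

Non-negativity and monotonicity follow because −log is non-negative and decreasing on (0, 1].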

## 2. Graph Entropy

Graph entropies start from a probability distribution P = (p_i)_{i=1,2,…,n}, on the probabilistic space. The entropy is then

H = −∑_{i=1,2,…,n} p_i log p_i

i.e., the expectation, E_P, of the self-information −log p. For a graph G = (V, E), one structural measure of this type involves the quantity

log_2 [(Card(E) − Card(V) + 1)/(i(v) − 1)]

which reappears below in the section on Chromatic Entropy.

“Gain in Entropy always means loss of Information, and nothing more” [5].

“Information is just known Entropy. Entropy is just unknown Information” [6].

## 3. Quantum Entropy

A useful test case throughout will be K_n, i.e., the complete graph on n nodes. For a graph G with Laplacian L(G) and diagonal degree matrix Δ = Δ(G), the normalized Laplacian is

Δ^{−1/2} L(G) Δ^{−1/2}

The degree sum of G will be denoted d_G, and it will be given by d_G = ∑ d(v). The average degree of G is expressed as d_G* = (1/m) ∑ d(v), where m is the number of non-isolated nodes.

The density matrix of G is then

ρ_G = L(G)/d_G = L(G)/tr(Δ(G)) = L(G)/(m d_G*)

The von Neumann entropy of a density matrix ρ is S(ρ) = −tr(ρ log_2 ρ), and S(ρ_G) is the QE of G.

Let the eigenvalues of L(G) and ρ_G, respectively, be given by

λ_1 ≥ λ_2 ≥ ... ≥ λ_n = 0, and μ_1 ≥ μ_2 ≥ ... ≥ μ_n = 0

where

μ_i = (λ_i)/(d_G) = (λ_i)/(m d_G*)

Hence the QE of ρ_G can also be written as

S(ρ_G) = −∑_i μ_i log_2 μ_i
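A short numerical sketch of the above (the helper name is ours, following the formulas in this section): for the complete graph K_n, the Laplacian spectrum is {0, n, …, n}, so the QE evaluates to log_2(n − 1).

```python
import numpy as np

def von_neumann_graph_entropy(adj):
    """S(rho_G) = -sum_i mu_i log2(mu_i), with rho_G = L(G)/tr(Delta(G))."""
    adj = np.asarray(adj, dtype=float)
    deg = adj.sum(axis=1)
    L = np.diag(deg) - adj          # Laplacian L(G) = Delta(G) - A(G)
    rho = L / deg.sum()             # density matrix: PSD, unit trace
    mu = np.linalg.eigvalsh(rho)    # eigenvalues mu_i of rho_G
    mu = mu[mu > 1e-12]             # convention 0 log 0 = 0
    return -np.sum(mu * np.log2(mu))

n = 4
K_n = np.ones((n, n)) - np.eye(n)   # adjacency matrix of K_4
qe = von_neumann_graph_entropy(K_n)
```

For K_4 the nonzero eigenvalues of ρ_G are three copies of 1/3, giving S(ρ_G) = log_2 3.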

## 4. Algorithmic Entropy

Consider the set of finite binary strings, Σ* = {0,1}*, ordered lexicographically. The length of a string x is denoted by |x|. For x ∈ Σ* = {0,1}*, the Algorithmic Entropy (Kolmogorov complexity) of x will be defined by

K(x) = min_p {|p| : U(p) = x}

where U is a fixed universal Turing machine and p ranges over its programs. Its time-bounded version, for a time bound t, is

K^t(x) = min_p {|p| : U(p) = x, in at most t(|x|) steps}

Among its basic properties,

K^t(x) ≤ |x| + O(1),
K^t(x/y) ≤ K^t(x) + O(1)
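K(x) itself is not computable, but compressed length gives a computable upper-bound proxy that illustrates the inequalities above (zlib is our choice of compressor; its constant overhead stands in for the O(1) terms):

```python
import os
import zlib

def k_upper(x: bytes) -> int:
    """Computable upper-bound proxy for K(x): zlib-compressed length."""
    return len(zlib.compress(x, 9))

regular = b"01" * 500            # highly regular string: small K(x)
random_ = os.urandom(1000)       # incompressible with high probability

overhead = 64                    # generous stand-in for the O(1) constant
```

The regular string compresses to a tiny description, while the random string's proxy stays near |x|, matching K^t(x) ≤ |x| + O(1).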

## 5. Metric Entropy

Let I^X be the set of mappings from X to the closed unit interval, I = [0,1]. A fuzzy σ-algebra will be a subfamily Σ ⊆ I^X satisfying that

- (1) 1 ∈ Σ;
- (2) if α ∈ Σ, then 1 − α ∈ Σ;
- (3) if {α_i} is a sequence in Σ, then ∨ α_i = sup_i α_i ∈ Σ.

A fuzzy measure on Σ is a map m satisfying

- [1] m(1) = 1;
- [2] for all α ∈ Σ, m(1 − α) = 1 − m(α);
- [3] for all α, β ∈ Σ, m(α ∨ β) + m(α ∧ β) = m(α) + m(β);
- [4] if {α_i} is a sequence in Σ such that α_i ↑ α, with α ∈ Σ, then m(α) = sup {m(α_i)}.

Given a measure-preserving transformation T of a probability space (X, Σ, μ) and a finite measurable partition ℘ of X, the entropy of the partition is

H_μ(℘) = ∑_{p∈℘} −μ(p) log μ(p)

and the entropy of T relative to ℘ is

h_μ(T, ℘) = lim_{n→∞} (1/n) H_μ(∨_{k=0}^{n−1} T^{−k}℘)

with H_μ the entropy of a partition, and where ∨ denotes the join of partitions. Such a limit always exists. Taking the supremum over all finite partitions,

h_μ(T) = sup_℘ h_μ(T, ℘)

h_μ(T) is named the Metric Entropy of T. So, we may differentiate this mathematical object from the well-known Topological Entropy.

Now let (X, d) be a metric space and Y ⊆ X. A set Y* ⊆ X is an ε-cover of Y, if for each y ∈ Y, there exists a y* ∈ Y* such that d(y, y*) ≤ ε. It is clear that there are many different covers of Y. But we are especially interested here in one which contains the least number of elements. The covering number is therefore

N(ε, Y) = min {Card(Y*) : Y* is an ε-cover}

A proper ε-cover additionally requires Y* ⊆ Y. And a proper covering number is defined in terms of the cardinality of the minimum proper cover. Both covering numbers and proper covering numbers are related by

N_proper(ε, Y) ≤ N((ε/2), Y)
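A greedy construction gives a (not necessarily minimal) proper ε-cover, hence an upper bound on N_proper(ε, Y); here is a sketch on a finite subset of the real line (all names are ours):

```python
import numpy as np

def greedy_proper_cover(Y, eps):
    """Greedy proper eps-cover of a finite Y in (R, |.|): repeatedly pick an
    uncovered point of Y as a center until every y in Y lies within eps of
    some center. len(result) upper-bounds the proper covering number."""
    Y = sorted(Y)
    covered = [False] * len(Y)
    centers = []
    for i, y in enumerate(Y):
        if not covered[i]:
            centers.append(y)                 # centers come from Y: proper
            for j, z in enumerate(Y):
                if abs(z - y) <= eps:
                    covered[j] = True
    return centers

pts = list(np.linspace(0.0, 1.0, 101))   # grid on [0, 1], step 0.01
cover = greedy_proper_cover(pts, 0.1)
```

Each center covers an interval of radius ε, so roughly 1/(2ε) centers suffice on [0, 1]; the greedy cover is within a constant factor of minimal.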

## 6. Topological Entropy

Let (X, d) be a compact metric space and f: X → X a continuous map. For each n ∈ ℕ, define a new metric, d_n, by

d_n(x, y) = max {d(f^i(x), f^i(y)) : 0 ≤ i < n}

Two points are close in d_n when their first n iterates (f^i, i = 0, 1, …, n − 1) are close. A subset E ⊆ X is (n, ε)-separated if each pair of distinct points x, y ∈ E satisfies d_n(x, y) > ε. Denote by N(n, ε) the maximum cardinality of an (n, ε)-separated set. It must be finite, because X is compact. The Topological Entropy of f is then

h_top = lim_{ε→0} lim sup_{n→∞} [(1/n) log N(n, ε)]

In general, this limit may exist, but it could be infinite. A possible interpretation of this number is as a measure of the average exponential growth of the number of distinguishable orbit segments. So, we could say that the higher the topological entropy is, the more essentially different orbits we have [2,7].
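The doubling map x ↦ 2x mod 1 has topological entropy log 2; the growth rate (1/n) log N(n, ε) can be approximated at ε = 1/2 by counting distinct binary itineraries of sampled orbits (a Monte Carlo sketch; the sample size and n are arbitrary choices of ours):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 10                               # orbit-segment length
xs = rng.random(50_000)              # random initial conditions in [0, 1)
words = set()
for x in xs:
    bits = []
    for _ in range(n):
        bits.append('0' if x < 0.5 else '1')   # side of the partition
        x = (2.0 * x) % 1.0                    # doubling map
    words.add(''.join(bits))
rate = np.log2(len(words)) / n       # ~ (1/n) log2 N(n, 1/2), in bits
```

With enough samples, essentially all 2^n itineraries appear, so the rate approaches 1 bit per iterate, i.e., log 2.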

## 7. Chromatic Entropy

Chromatic Entropy applies the Shannon functional,

H = −∑_i p_i log p_i

to the probability distribution induced on the color classes of a proper coloring of the graph, minimized over all such colorings. It is related to the structural quantity

log_2 [(Card(E) − Card(V) + 1)/(i(v) − 1)]

## 8. Mutual Relationship between Entropies

Metric and topological entropy are linked by the variational principle:

h_top = sup {h_μ(T)}

where the supremum is taken over μ ∈ P(X), with μ belonging to the set of all T-invariant Borel probability measures on X.

Shannon entropy, in turn, arises as a limit of the Rényi entropies. Recall that

H(X) = −∑ P(x) log_2 P(x)

and define, for α ≠ 1,

H_α(X) = (1/(1 − α)) log_2 (∑ P(x)^α)

Then

lim_{α→1} {(1/(1 − α)) log_2 (∑ P(x)^α)} = −∑ P(x) log_2 P(x)

that is,

lim_{α→1} H_α(X) = H(X)
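The limit can be checked numerically (function names are ours):

```python
import numpy as np

def shannon(p):
    """H(X) = -sum P(x) log2 P(x)."""
    p = np.asarray(p, dtype=float)
    return -np.sum(p * np.log2(p))

def renyi(p, alpha):
    """H_alpha(X) = (1/(1 - alpha)) log2(sum P(x)^alpha), alpha != 1."""
    p = np.asarray(p, dtype=float)
    return np.log2(np.sum(p ** alpha)) / (1.0 - alpha)

p = np.array([0.5, 0.3, 0.2])
h = shannon(p)
# H_alpha approaches H(X) from both sides as alpha -> 1
near = [renyi(p, a) for a in (0.9, 0.99, 0.999, 1.001, 1.01, 1.1)]
```

The gap |H_α(X) − H(X)| shrinks linearly in |α − 1| near α = 1.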

## 9. Graph Symmetry

A symmetry of a graph G is an automorphism: a permutation g of the node set such that u ∼ v if and only if u^g ∼ v^g, i.e., adjacency is preserved.

## 10. Symmetry as Invariance

## 11. Fuzzy Entropies

## 12. About Negentropy

The entropy of a random variable Y is

H(Y) = −∑_i P(Y = y_i) log P(Y = y_i)

where the y_i are the possible values of Y. The Negentropy of y is then defined as

J(y) = H(y_gauss) − H(y)

with y_gauss a Gaussian random variable of the same covariance matrix as y.
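For a uniform random variable the negentropy can be evaluated in closed form, since the differential entropies of the uniform and of the matching Gaussian are both known analytically (a sketch in nats; J ≥ 0 because the Gaussian maximizes entropy at fixed variance):

```python
import math

# Negentropy of y ~ U(a, b), computed analytically.
a, b = 0.0, 1.0
var_y = (b - a) ** 2 / 12.0                 # variance of U(a, b)
h_y = math.log(b - a)                       # differential entropy of U(a, b)
h_gauss = 0.5 * math.log(2.0 * math.pi * math.e * var_y)  # same-variance Gaussian
J = h_gauss - h_y                           # negentropy J(y) > 0
```

Here J = (1/2) ln(2πe/12) ≈ 0.176 nats, independent of the interval width.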

“Negentropy of a living system is the entropy that it exports, to maintain its own entropy low”

“A living system imports negentropy, and stores it”

## 13. Conclusions

## Acknowledgements

## References

1. Shannon, C.E. A mathematical theory of communication. Bell Syst. Tech. J. **1948**, 27, 379–423, 623–656.
2. Wehrl, A. General properties of entropy. Rev. Mod. Phys. **1978**, 50, 221–260.
3. Yager, R. On the Measure of Fuzziness and Negation. Int. J. General Syst. **1979**, 5, 221–229.
4. Higashi, M.; Klir, G.J. Measures of Uncertainty and Information based on possibility distributions. Int. J. General Syst. **1982**, 9, 43–58.
5. Lewis, G.N. The Entropy of Radiation. Proc. Natl. Acad. Sci. USA **1927**, 13, 307–314.
6. Frank, M.P. Approaching Physical Limits of Computing. Multiple-Valued Logic **2005**.
7. Simonyi, G. Graph Entropy: A Survey. DIMACS **1995**, 20, 399–441.
8. Sinai, G. On the concept of Entropy of a Dynamical System. Dokl. Akad. Nauk SSSR **1959**, 124, 768–771.
9. Passarini, F.; Severini, S. The von Neumann Entropy of Networks; University of Munich: Munich, Germany, 2009.
10. Volkenstein, M.V. Entropy and Information (Progress in Mathematical Physics); Birkhäuser Verlag: Berlin, Germany, 2009; Volume 57.
11. Devine, S. The insights of algorithmic entropy. Entropy **2009**, 11, 85–110.
12. Dehmer, M. Information processing in Complex Networks: Graph entropy and Information functionals. Appl. Math. Comput. **2008**, 201, 82–94.
13. Jozsa, R. Quantum Information and Its Properties. In Introduction to Quantum Computation and Information; Lo, H.K., Popescu, S., Spiller, T., Eds.; World Scientific: Singapore, 1998.
14. Titchener, M.R.; Nicolescu, R.; Staiger, L.; Gulliver, A.; Speidel, U. Deterministic Complexity and Entropy. J. Fundam. Inf. **2004**, 64.
15. Titchener, M.R. A Measure of Information. In Proceedings of the Data Compression Conference 2000, Snowbird, UT, USA, 2000; pp. 353–362.
16. Titchener, M.R. A Deterministic Theory of Complexity, Information and Entropy. In Proceedings of the IEEE Information Theory Workshop, San Diego, CA, USA, February 1998.
17. Preda, V.; Balcau, C. Entropy Optimization with Applications; Editura Academiei Romana: Bucureşti, România, 2010.
18. Li, M.; Vitányi, P. An Introduction to Kolmogorov Complexity and Its Applications, 3rd ed.; Springer Verlag: Berlin, Germany, 2008.
19. Dumitrescu, D. Entropy of a fuzzy process. Fuzzy Sets Syst. **1993**, 55, 169–177.
20. Wang, Z.; Klir, G.J. Generalized Measure Theory; Springer Verlag: Berlin, Germany and New York, NY, USA, 2008.
21. Garrido, A.; Postolica, V. Modern Optimization; Editura Matrix-Rom: Bucuresti, Romania, 2011.
22. Rényi, A. On measures of information and entropy. In Proceedings of the 4th Berkeley Symposium on Mathematics, Statistics and Probability, Berkeley, CA, USA, 20 June–30 July 1960; University of California Press: Berkeley, CA, USA; pp. 547–561.
23. Jaynes, E.T. Information theory and statistical mechanics. Phys. Rev. **1956**, 108, 171–190.
24. Georgescu-Roegen, N. The Entropy Law and the Economic Process; Harvard University Press: Cambridge, MA, USA, 1971.
25. Liu, X. Entropy, distance and similarity measures of fuzzy sets and their relations. Fuzzy Sets Syst. **1992**, 52, 305–318.
26. De Luca, A.; Termini, S. A definition of non-probabilistic entropy, in the setting of fuzzy theory. Inf. Control **1972**, 20, 301–312.
27. You, C.; Gao, X. Maximum entropy membership functions for discrete fuzzy variables. Inf. Sci. **2009**, 179, 2353–2361.
28. Dumitrescu, D. Fuzzy measures and the entropy of fuzzy partitions. J. Math. Anal. Appl. **1993**, 176, 359–373.

© 2011 by the author; licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution license (http://creativecommons.org/licenses/by/3.0/).

## Share and Cite

Garrido, A. Classifying Entropy Measures. *Symmetry* **2011**, *3*, 487–502. https://doi.org/10.3390/sym3030487