# The Entropy Universe


## Abstract


## 1. Introduction

- How the different concepts of entropy arose.
- The mathematical definitions of each entropy.
- How the entropies are related to each other.
- The areas of application of each entropy and their impact on the scientific community.

## 2. Building the Universe of Entropies

### 2.1. Early Times of the Entropy Concept

### 2.2. Entropies Derived from Shannon Entropy

#### 2.2.1. Differential Entropy

#### 2.2.2. Spectral Entropy

#### 2.2.3. Tone-Entropy

#### 2.2.4. Wavelet Entropy

#### 2.2.5. Empirical Mode Decomposition Energy Entropy

- Calculate the energy ${E}_{i}$ of each $i$th IMF ${c}_{i}$:$${E}_{i}=\sum _{j=1}^{m}{c}_{i}{(j)}^{2}$$
- Calculate the total energy of these n efficient IMFs:$$E=\sum _{i=1}^{n}{E}_{i}.$$
- Calculate the energy entropy of the IMFs, with ${p}_{i}={E}_{i}/E$:$${H}_{en}=-\sum _{i=1}^{n}{p}_{i}\cdot \log \left({p}_{i}\right)$$
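The three steps above can be sketched in Python. The IMFs are assumed to be precomputed by an EMD routine (not shown here), and the probabilities are taken as the normalised energies, $p_i = E_i/E$, as in the usual EMD energy entropy definition.

```python
import numpy as np

def emd_energy_entropy(imfs):
    """EMD energy entropy of a set of IMFs.

    `imfs` is an (n, m) array: n intrinsic mode functions of length m,
    assumed to come from an EMD routine applied beforehand.
    """
    imfs = np.asarray(imfs, dtype=float)
    energies = np.sum(imfs ** 2, axis=1)   # E_i = sum_j c_i(j)^2
    total = energies.sum()                 # E = sum_i E_i
    p = energies / total                   # p_i = E_i / E
    p = p[p > 0]                           # skip empty IMFs to avoid log(0)
    return float(-np.sum(p * np.log(p)))   # H_en = -sum_i p_i log(p_i)
```

For n IMFs of equal energy the result is $\ln n$, the maximum possible value.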

#### 2.2.6. $\Delta$-Entropy

### 2.3. Kolmogorov, Topological and Geometric Entropies

### 2.4. Rényi Entropy

#### 2.4.1. Particular Cases of Rényi Entropy

#### 2.4.2. $\epsilon$-Smooth Rényi Entropy

#### 2.4.3. Rényi Entropy for Continuous Random Variables and the Different Definition of Quadratic Entropy

### 2.5. Havrda–Charvát Structural $\alpha$-Entropy and Tsallis Entropy

### 2.6. Permutation Entropy and Related Entropies

### 2.7. Rank-Based Entropy and Bubble Entropy

- Compute, for $1\le i<j\le N-m$, the mutual distance vectors: ${d}_{k(i,j)}={\Vert {v}_{m,i}-{v}_{m,j}\Vert}_{\infty}$ and ${d}_{k(i,j)}^{\prime}=\left|{x}_{i+m}-{x}_{j+m}\right|$, where ${\Vert \cdot \Vert}_{\infty}$ is the infinity norm, ${v}_{m,i}=\{{x}_{i},{x}_{i+1},\dots ,{x}_{i+m-1}\}$, and $k=k(i,j)$ is the index assigned to each $(i,j)$ pair, with $1\le k\le K=(N-m-1)(N-m)/2$.
- Consider the vector ${d}_{k}$ and find the permutation $\pi \left(k\right)$ such that the vector ${S}_{k}={d}_{\pi \left(k\right)}$ is sorted in ascending order. Now, if the system is deterministic, we expect that if the vectors ${v}_{m,i}$ and ${v}_{m,j}$ are close, then the new observation from each vector, ${x}_{i+m}$ and ${x}_{j+m}$, should be close too. In other words, ${S}_{k}^{\prime}={d}_{\pi \left(k\right)}^{\prime}$ should be almost sorted too. Compute the inversion count, which is a measure of a vector's disorder.
- Determine the largest index ${k}_{\rho}$ satisfying ${S}_{{k}_{\rho}}<\rho $ and compute the number $I$ of inversion pairs $({k}_{1},{k}_{2})$ such that ${k}_{1}<{k}_{\rho}$, ${k}_{1}<{k}_{2}$ and ${S}_{{k}_{1}}^{\prime}>{S}_{{k}_{2}}^{\prime}$.
- Compute the RbE as:$$RbE=-\ln \left(1-\frac{I}{\left(2K-{k}_{\rho}-1\right){k}_{\rho}}\right).$$
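As an illustration only, the RbE steps can be sketched in Python. The threshold $\rho$ and the normalisation constant $(2K-k_\rho-1)\,k_\rho$ follow the reconstruction above and should be checked against the original rank-based entropy formulation before serious use.

```python
import numpy as np

def rank_based_entropy(x, m=2, rho=0.5):
    """Sketch of rank-based entropy (RbE); rho and the
    normalisation constant are assumptions of this reconstruction."""
    x = np.asarray(x, dtype=float)
    N = len(x)
    # embedding vectors v_{m,i} = (x_i, ..., x_{i+m-1}), i = 1..N-m
    V = np.array([x[i:i + m] for i in range(N - m)])
    nxt = x[m:]                        # the "new observations" x_{i+m}
    d, dp = [], []
    for i in range(len(V)):
        for j in range(i + 1, len(V)):
            d.append(np.max(np.abs(V[i] - V[j])))  # infinity-norm distance
            dp.append(abs(nxt[i] - nxt[j]))
    K = len(d)
    order = np.argsort(d, kind="stable")           # permutation pi(k)
    S, Sp = np.asarray(d)[order], np.asarray(dp)[order]
    below = np.nonzero(S < rho)[0]
    if below.size == 0:
        return 0.0
    k_rho = int(below[-1]) + 1         # 1-based largest index with S < rho
    # inversion pairs (k1, k2): k1 < k_rho, k1 < k2, S'[k1] > S'[k2]
    I = sum(1 for k1 in range(k_rho - 1)
              for k2 in range(k1 + 1, K) if Sp[k1] > Sp[k2])
    denom = (2 * K - k_rho - 1) * k_rho
    if denom == 0:
        return 0.0
    return float(-np.log(1.0 - I / denom))
```

A well-sorted $S'$ gives few inversions and an RbE near zero; disorder in $S'$ pushes the value up.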

- Compute a histogram from the ${n}_{i}$ values and normalize it by $N-m$ to obtain the probabilities ${p}_{i}$ (describing how likely a given number of swaps ${n}_{i}$ is).
- Repeat steps 1 to 3 to compute $RP{E}_{swaps}^{m}$.
- Compute BEn by:$$BEn=\frac{RP{E}_{swaps}^{m}-RP{E}_{swaps}^{m-2}}{{\log}_{2}\left(\frac{m}{m-2}\right)}.$$
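A minimal BEn sketch follows, assuming the swap counts are those of bubble sort, the histogram entropy is the order-2 Rényi entropy of the swap-count distribution, and the two embedding dimensions are $m$ and $m-2$ as in the formula above; these choices are illustrative, not definitive.

```python
import numpy as np

def swap_count(v):
    """Number of exchanges bubble sort needs to sort v ascending."""
    v, swaps = list(v), 0
    for i in range(len(v)):
        for j in range(len(v) - 1 - i):
            if v[j] > v[j + 1]:
                v[j], v[j + 1] = v[j + 1], v[j]
                swaps += 1
    return swaps

def renyi2_swap_entropy(x, m):
    """Order-2 Renyi entropy of the bubble-sort swap counts n_i."""
    counts = [swap_count(x[i:i + m]) for i in range(len(x) - m + 1)]
    _, freq = np.unique(counts, return_counts=True)
    p = freq / freq.sum()              # histogram -> probabilities p_i
    return -np.log2(np.sum(p ** 2))

def bubble_entropy(x, m=10):
    """BEn sketch; the (m, m-2) pairing and log2(m/(m-2))
    normalisation follow the reconstructed formula above."""
    hm = renyi2_swap_entropy(x, m)
    hm2 = renyi2_swap_entropy(x, m - 2)
    return float((hm - hm2) / np.log2(m / (m - 2)))
```

A constant series needs no swaps at either dimension, so its BEn is zero.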

### 2.8. Topological Information Content, Graph Entropy and Horizontal Visibility Graph Entropy

### 2.9. Approximate and Sample Entropies

#### 2.9.1. Quadratic Sample Entropy, Coefficient of Sample Entropy and Intrinsic Mode Entropy

#### 2.9.2. Dispersion Entropy and Fluctuation-Based Dispersion Entropy

- First, ${x}_{j}(j=1,2,\dots ,N)$ are mapped to c classes, labeled from 1 to c. The classified signal is ${u}_{j}(j=1,2,\dots ,N)$. To do so, there are a number of linear and nonlinear mapping techniques. For more details see [116].
- Each embedding vector ${U}_{m}^{\tau ,c}\left(i\right)$ with m embedding dimension and $\tau $ time delay is created according to ${U}_{m}^{\tau ,c}\left(i\right)=({u}_{i}^{c},{u}_{i+\tau}^{c},{u}_{i+2\tau}^{c},\dots ,{u}_{i+(m-1)\tau}^{c})$ with $i=1,\dots ,N-(m-1)\tau $. Each time-series ${U}_{m}^{\tau ,c}\left(i\right)$ is mapped to a dispersion pattern ${\pi}_{{v}_{0}{v}_{1}\dots {v}_{m-1}}$, where ${u}_{i}^{c}={v}_{0}$, ${u}_{i+\tau}^{c}={v}_{1}$, ..., ${u}_{i+(m-1)\tau}^{c}={v}_{m-1}$. The number of possible dispersion patterns that can be assigned to each time-series ${U}_{m}^{\tau ,c}\left(i\right)$ is equal to ${c}^{m}$, since the signal has m members and each member can be one of the integers from 1 to c [18].
- For each of the ${c}^{m}$ potential dispersion patterns ${\pi}_{{v}_{0}{v}_{1}\dots {v}_{m-1}}$, the relative frequency is obtained as follows:$$p\left({\pi}_{{v}_{0}{v}_{1}\dots {v}_{m-1}}\right)=\frac{\#\left\{i \mid i\le N-(m-1)\tau ,\ {U}_{m}^{\tau ,c}\left(i\right)\ \text{has type}\ {\pi}_{{v}_{0}{v}_{1}\dots {v}_{m-1}}\right\}}{N-(m-1)\tau}$$
- Finally, the DispEn value is calculated, based on the SE definition of entropy, as follows:$$DispEn(X,m,c,\tau )=-\sum _{\pi =1}^{{c}^{m}}p\left({\pi}_{{v}_{0}{v}_{1}\dots {v}_{m-1}}\right)\cdot \ln \left(p\left({\pi}_{{v}_{0}{v}_{1}\dots {v}_{m-1}}\right)\right)$$
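A minimal DispEn sketch in Python. The class mapping used here is the normal-CDF (NCDF) technique, one of the linear and nonlinear options mentioned in step 1; a different mapping would change the $u_j$ values but not the rest of the procedure.

```python
import math
import numpy as np

def dispersion_entropy(x, m=2, c=3, tau=1):
    """DispEn sketch using the normal-CDF mapping to c classes."""
    x = np.asarray(x, dtype=float)
    mu, sigma = x.mean(), x.std()
    # map each sample to (0, 1) via the normal CDF fitted to the signal
    y = np.array([0.5 * (1.0 + math.erf((v - mu) / (sigma * math.sqrt(2.0))))
                  for v in x])
    # round to integer classes u_j in 1..c
    u = np.clip(np.round(c * y + 0.5).astype(int), 1, c)
    n = len(x) - (m - 1) * tau
    # dispersion patterns (u_i, u_{i+tau}, ..., u_{i+(m-1)tau})
    pats = np.array([[u[i + k * tau] for k in range(m)] for i in range(n)])
    _, freq = np.unique(pats, axis=0, return_counts=True)
    p = freq / n                       # relative frequencies of the patterns
    return float(-np.sum(p * np.log(p)))
```

The value is bounded by $\ln(c^m)$, reached when all $c^m$ patterns are equally likely.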

#### 2.9.3. Fuzzy Entropy

#### 2.9.4. Modified Sample Entropy

#### 2.9.5. Fuzzy Measure Entropy

#### 2.9.6. Kernel Entropies

### 2.10. Multiscale Entropy

## 3. The Entropy Universe Discussion

## 4. Entropy Impact in the Scientific Community

### 4.1. Number of Citations

### 4.2. Areas of Application

## 5. Conclusions

## Author Contributions

## Funding

## Conflicts of Interest

## Abbreviations

| Abbreviation | Meaning |
| --- | --- |
| ApEn | approximate entropy |
| BEn | bubble entropy |
| CauchyKE | Cauchy kernel entropy |
| CKE | circular kernel entropy |
| CosEn | coefficient of sample entropy |
| DE | differential entropy |
| DispEn | dispersion entropy |
| EKE | exponential kernel entropy |
| EMD | empirical mode decomposition |
| EMDEnergyEn | empirical mode decomposition energy entropy |
| FDispEn | fluctuation-based dispersion entropy |
| FuzzyEn | fuzzy entropy |
| FuzzyMEn | fuzzy measure entropy |
| i.i.d. | independent and identically distributed |
| IME | intrinsic mode entropy |
| IMF | intrinsic mode functions |
| InMDEn | intrinsic mode dispersion entropy |
| KbEn | kernel-based entropy |
| LKE | Laplacian kernel entropy |
| mSampEn | modified sample entropy |
| NTPE | normalized Tsallis permutation entropy |
| PE | permutation entropy |
| QSE | quadratic sample entropy |
| RbE | rank-based entropy |
| RE | Rényi entropy |
| RPE | Rényi permutation entropy |
| SampEn | sample entropy |
| SE | Shannon entropy |
| SKE | spherical kernel entropy |
| SortEn | sorting entropy |
| SpEn | spectral entropy |
| TE | Tsallis entropy |
| T-E | tone-entropy |
| TKE | triangular kernel entropy |
| TopEn | topological entropy |
| WaEn | wavelet entropy |
| WoS | Web of Science |

## References

- Flores Camacho, F.; Ulloa Lugo, N.; Covarrubias Martínez, H. The concept of entropy, from its origins to teachers. Rev. Mex. Física
**2015**, 61, 69–80. [Google Scholar] - Harris, H.H. Review of Entropy and the Second Law: Interpretation and Misss-Interpretationsss. J. Chem. Educ.
**2014**, 91, 310–311. [Google Scholar] [CrossRef] - Shaw, D.; Davis, C.H. Entropy and information: A multidisciplinary overview. J. Am. Soc. Inf. Sci.
**1983**, 34, 67–74. [Google Scholar] [CrossRef] - Kostic, M.M. The elusive nature of entropy and its physical meaning. Entropy
**2014**, 16, 953–967. [Google Scholar] [CrossRef] [Green Version] - Popovic, M. Researchers in an entropy wonderland: A review of the entropy concept. arXiv
**2017**, arXiv:1711.07326. [Google Scholar] - Batten, D.F. A review of entropy and information theory. In Spatial Analysis of Interacting Economies; Springer: Berlin/Heidelberg, Germany, 1983; pp. 15–52. [Google Scholar]
- Amigó, J.M.; Balogh, S.G.; Hernández, S. A brief review of generalized entropies. Entropy
**2018**, 20, 813. [Google Scholar] [CrossRef] [PubMed] [Green Version] - Tempesta, P. Beyond the Shannon–Khinchin formulation: The composability axiom and the universal-group entropy. Ann. Phys.
**2016**, 365, 180–197. [Google Scholar] [CrossRef] [Green Version] - Namdari, A.; Li, Z. A review of entropy measures for uncertainty quantification of stochastic processes. Adv. Mech. Eng.
**2019**, 11. [Google Scholar] [CrossRef] - Rong, L.; Shang, P. Topological entropy and geometric entropy and their application to the horizontal visibility graph for financial time series. Nonlinear Dyn.
**2018**, 92, 41–58. [Google Scholar] [CrossRef] - Blanco, S.; Figliola, A.; Quiroga, R.Q.; Rosso, O.; Serrano, E. Time-frequency analysis of electroencephalogram series. III. Wavelet packets and information cost function. Phys. Rev. E
**1998**, 57, 932. [Google Scholar] [CrossRef] - Huang, N.E.; Shen, Z.; Long, S.R.; Wu, M.C.; Shih, H.H.; Zheng, Q.; Yen, N.C.; Tung, C.C.; Liu, H.H. The empirical mode decomposition and the Hilbert spectrum for nonlinear and non-stationary time series analysis. Proc. R. Soc. London. Ser. A Math. Phys. Eng. Sci.
**1998**, 454, 903–995. [Google Scholar] [CrossRef] - Bandt, C.; Pompe, B. Permutation entropy: A natural complexity measure for time series. Phys. Rev. Lett.
**2002**, 88, 174102. [Google Scholar] [CrossRef] [PubMed] - Zhao, X.; Shang, P.; Huang, J. Permutation complexity and dependence measures of time series. EPL Europhys. Lett.
**2013**, 102, 40005. [Google Scholar] [CrossRef] - Richman, J.S.; Moorman, J.R. Physiological time-series analysis using approximate entropy and sample entropy. Am. J. Physiol. Heart Circ. Physiol.
**2000**, 278, H2039–H2049. [Google Scholar] [CrossRef] [PubMed] [Green Version] - Lake, D.E.; Moorman, J.R. Accurate estimation of entropy in very short physiological time series: The problem of atrial fibrillation detection in implanted ventricular devices. Am. J. Physiol. Heart Circ. Physiol.
**2011**, 300, H319–H325. [Google Scholar] [CrossRef] [PubMed] - Xie, H.B.; He, W.X.; Liu, H. Measuring time series regularity using nonlinear similarity-based sample entropy. Phys. Lett. A
**2008**, 372, 7140–7146. [Google Scholar] [CrossRef] - Rostaghi, M.; Azami, H. Dispersion entropy: A measure for time-series analysis. IEEE Signal Process. Lett.
**2016**, 23, 610–614. [Google Scholar] [CrossRef] - Xu, L.S.; Wang, K.Q.; Wang, L. Gaussian kernel approximate entropy algorithm for analyzing irregularity of time-series. In Proceedings of the 2005 International Conference on Machine Learning and Cybernetics, Guangzhou, China, 15–21 August 2005; Volume 9, pp. 5605–5608. [Google Scholar]
- Martin, J.S.; Smith, N.A.; Francis, C.D. Removing the entropy from the definition of entropy: Clarifying the relationship between evolution, entropy, and the second law of thermodynamics. Evol. Educ. Outreach
**2013**, 6, 30. [Google Scholar] [CrossRef] - Chakrabarti, C.; De, K. Boltzmann-Gibbs entropy: Axiomatic characterization and application. Int. J. Math. Math. Sci.
**2000**, 23, 243–251. [Google Scholar] [CrossRef] [Green Version] - Haubold, H.; Mathai, A.; Saxena, R. Boltzmann-Gibbs entropy versus Tsallis entropy: Recent contributions to resolving the argument of Einstein concerning “Neither Herr Boltzmann nor Herr Planck has given a definition of W”? Astrophys. Space Sci.
**2004**, 290, 241–245. [Google Scholar] [CrossRef] [Green Version] - Cariolaro, G. Classical and Quantum Information Theory. In Quantum Communications; Springer: Berlin/Heidelberg, Germany, 2015; pp. 573–637. [Google Scholar]
- Lindley, D.; O’Connell, J. Boltzmann’s atom: The great debate that launched a revolution in physics. Am. J. Phys.
**2001**, 69, 1020. [Google Scholar] [CrossRef] - Planck, M. On the theory of the energy distribution law of the normal spectrum. Verh. Deut. Phys. Ges.
**1900**, 2, 237–245. [Google Scholar] - Gibbs, J.W. Elementary Principles in Statistical Mechanics: Developed with Especial Reference to the Rational Foundation of Thermodynamics; C. Scribner’s Sons: Farmington Hills, MI, USA, 1902. [Google Scholar]
- Rondoni, L.; Cohen, E. Gibbs entropy and irreversible thermodynamics. Nonlinearity
**2000**, 13, 1905. [Google Scholar] [CrossRef] [Green Version] - Goldstein, S.; Lebowitz, J.L.; Tumulka, R.; Zanghi, N. Gibbs and Boltzmann entropy in classical and quantum mechanics. arXiv
**2019**, arXiv:1903.11870. [Google Scholar] - Hartley, R.V. Transmission of information 1. Bell Syst. Tech. J.
**1928**, 7, 535–563. [Google Scholar] [CrossRef] - Von Neumann, J. Mathematische Grundlagen der Quantenmechanik; Springer: Berlin/Heidelberg, Germany, 1932. [Google Scholar]
- Legeza, Ö.; Sólyom, J. Optimizing the density-matrix renormalization group method using quantum information entropy. Phys. Rev. B
**2003**, 68, 195116. [Google Scholar] [CrossRef] - Coles, P.J.; Berta, M.; Tomamichel, M.; Wehner, S. Entropic uncertainty relations and their applications. Rev. Mod. Phys.
**2017**, 89, 015002. [Google Scholar] [CrossRef] [Green Version] - Shannon, C.E. A mathematical theory of communication. Bell Syst. Tech. J.
**1948**, 27, 379–423. [Google Scholar] [CrossRef] [Green Version] - Shannon, C.E.; Weaver, W. The Mathematical Theory of Communication; University of Illinois Press: Urbana, IL, USA, 1949; pp. 1–117. [Google Scholar]
- Weaver, W. Recent contributions to the mathematical theory of communication. ETC Rev. Gen. Semant.
**1953**, 10, 261–281. [Google Scholar] - Rioul, O. This is IT: A primer on Shannon’s entropy and information. L’Information, Semin. Poincare
**2018**, 23, 43–77. [Google Scholar] - Kline, R.R. The Cybernetics Moment: Or Why We Call Our Age the Information Age; JHU Press: Baltimore, MA, USA, 2015. [Google Scholar]
- Shannon, C.E.; Weaver, W. The Mathematical Theory of Communication; Number pt. 11 in Illini books; University of Illinois Press: Urbana, IL, USA, 1963. [Google Scholar]
- Smith, J.D. Some observations on the concepts of information-theoretic entropy and randomness. Entropy
**2001**, 3, 1–11. [Google Scholar] [CrossRef] - Ochs, W. Basic properties of the generalized Boltzmann-Gibbs-Shannon entropy. Rep. Math. Phys.
**1976**, 9, 135–155. [Google Scholar] [CrossRef] - Plastino, A.; Plastino, A.; Vucetich, H. A quantitative test of Gibbs’ statistical mechanics. Phys. Lett. A
**1995**, 207, 42–46. [Google Scholar] [CrossRef] - Stratonovich, R. The entropy of systems with a random number of particles. Sov. Phys. JETP-USSR
**1955**, 1, 254–261. [Google Scholar] - Khinchin, A.Y. Mathematical Foundations of Information Theory; Courier Corporation: Dover, NY, USA, 2013. [Google Scholar]
- Cover, T.M.; Thomas, J.A. Elem. Inf. Theory; John Wiley & Sons: Hoboken, NJ, USA, 2012. [Google Scholar]
- Aczél, J.; Daróczy, Z. On Measures of Information and Their Characterizations; Academic Press: New York, NY, USA, 1975; p. 168. [Google Scholar]
- Kullback, S. Information Theory and Statistics; Courier Corporation: Dover, New York, USA, 1997. [Google Scholar]
- Chakrabarti, C.; Chakrabarty, I. Shannon entropy: Axiomatic characterization and application. Int. J. Math. Math. Sci.
**2005**, 2005, 2847–2854. [Google Scholar] [CrossRef] [Green Version] - Marsh, C. Introduction to Continuous Entropy; Department of Computer Science, Princeton University: Princeton, NJ, USA, 2013. [Google Scholar]
- Kapur, J.N.; Kesavan, H.K. Entropy optimization principles and their applications. In Entropy and Energy Dissipation in Water Resources; Springer: Berlin/Heidelberg, Germany, 1992; pp. 3–20. [Google Scholar]
- Borowska, M. Entropy-based algorithms in the analysis of biomedical signals. Stud. Logic Gramm. Rhetor.
**2015**, 43, 21–32. [Google Scholar] [CrossRef] [Green Version] - Oida, E.; Moritani, T.; Yamori, Y. Tone-entropy analysis on cardiac recovery after dynamic exercise. J. Appl. Physiol.
**1997**, 82, 1794–1801. [Google Scholar] [CrossRef] [PubMed] - Rosso, O.A.; Blanco, S.; Yordanova, J.; Kolev, V.; Figliola, A.; Schürmann, M.; Başar, E. Wavelet entropy: A new tool for analysis of short duration brain electrical signals. J. Neurosci. Methods
**2001**, 105, 65–75. [Google Scholar] [CrossRef] - Yu, Y.; Junsheng, C. A roller bearing fault diagnosis method based on EMD energy entropy and ANN. J. Sound Vib.
**2006**, 294, 269–277. [Google Scholar] [CrossRef] - Chen, B.; Zhu, Y.; Hu, J.; Prı, J.C. Δ-Entropy: Definition, properties and applications in system identification with quantized data. Inf. Sci.
**2011**, 181, 1384–1402. [Google Scholar] [CrossRef] - Kolmogorov, A.N. A New Metric Invariant of Transient Dynamical Systems and Automorphisms in Lebesgue Spaces; Doklady Akademii Nauk; Russian Academy of Sciences: Moscow, Russia, 1958; Volume 119, pp. 861–864. [Google Scholar]
- Wong, K.S.; Salleh, Z. A note on the notions of topological entropy. Earthline J. Math. Sci.
**2018**, 1–16. [Google Scholar] [CrossRef] - Sinai, I. On the concept of entropy for a dynamic system. Dokl. Akad. Nauk. SSSR
**1959**, 124, 768–771. [Google Scholar] - Farmer, J.D. Information dimension and the probabilistic structure of chaos. Z. Naturforschung A
**1982**, 37, 1304–1326. [Google Scholar] [CrossRef] - Frigg, R. In what sense is the Kolmogorov-Sinai entropy a measure for chaotic behaviour?—bridging the gap between dynamical systems theory and communication theory. Br. J. Philos. Sci.
**2004**, 55, 411–434. [Google Scholar] [CrossRef] [Green Version] - Costa, M.; Goldberger, A.L.; Peng, C.K. Multiscale entropy analysis of biological signals. Phys. Rev. E
**2005**, 71, 021906. [Google Scholar] [CrossRef] [PubMed] [Green Version] - Orozco-Arroyave, J.R.; Arias-Londono, J.D.; Vargas-Bonilla, J.F.; Nöth, E. Analysis of speech from people with Parkinson’s disease through nonlinear dynamics. In International Conference on Nonlinear Speech Processing; Springer: Berlin/Heidelberg, Germany, 2013; pp. 112–119. [Google Scholar]
- Zmeskal, O.; Dzik, P.; Vesely, M. Entropy of fractal systems. Comput. Math. Appl.
**2013**, 66, 135–146. [Google Scholar] [CrossRef] - Henriques, T.; Gonçalves, H.; Antunes, L.; Matias, M.; Bernardes, J.; Costa-Santos, C. Entropy and compression: Two measures of complexity. J. Eval. Clin. Pract.
**2013**, 19, 1101–1106. [Google Scholar] [CrossRef] - Eckmann, J.P.; Ruelle, D. Ergodic theory of chaos and strange attractors. In The Theory of Chaotic Attractors; Springer: Berlin/Heidelberg, Germany, 1985; pp. 273–312. [Google Scholar]
- Xiong, W.; Faes, L.; Ivanov, P.C. Entropy measures, entropy estimators, and their performance in quantifying complex dynamics: Effects of artifacts, nonstationarity, and long-range correlations. Phys. Rev. E
**2017**, 95, 062114. [Google Scholar] [CrossRef] [Green Version] - Adler, R.L.; Konheim, A.G.; McAndrew, M.H. Topological entropy. Trans. Am. Math. Soc.
**1965**, 114, 309–319. [Google Scholar] [CrossRef] - Feng, D.J.; Huang, W. Variational principles for topological entropies of subsets. J. Funct. Anal.
**2012**, 263, 2228–2254. [Google Scholar] [CrossRef] [Green Version] - Nilsson, J. On the entropy of a family of random substitutions. Monatshefte Math.
**2012**, 168, 563–577. [Google Scholar] [CrossRef] [Green Version] - Bowen, R. Entropy for group endomorphisms and homogeneous spaces. Trans. Am. Math. Soc.
**1971**, 153, 401–414. [Google Scholar] [CrossRef] - Cánovas, J.; Rodríguez, J. Topological entropy of maps on the real line. Topol. Appl.
**2005**, 153, 735–746. [Google Scholar] [CrossRef] [Green Version] - Bowen, R. Topological entropy for noncompact sets. Trans. Am. Math. Soc.
**1973**, 184, 125–136. [Google Scholar] [CrossRef] - Handel, M.; Kitchens, B.; Rudolph, D.J. Metrics and entropy for non-compact spaces. Isr. J. Math.
**1995**, 91, 253–271. [Google Scholar] [CrossRef] - Addabbo, R.; Blackmore, D. A dynamical systems-based hierarchy for Shannon, metric and topological entropy. Entropy
**2019**, 21, 938. [Google Scholar] [CrossRef] [Green Version] - Ghys, E.; Langevin, R.; Walczak, P. Entropie géométrique des feuilletages. Acta Math.
**1988**, 160, 105–142. [Google Scholar] [CrossRef] - Hurder, S. Entropy and Dynamics of C1 Foliations; University of Illinois: Chicago, IL, USA, 2020. [Google Scholar]
- Biś, A. Entropy of distributions. Topol. Appl.
**2005**, 152, 2–10. [Google Scholar] [CrossRef] [Green Version] - Hurder, S. Lectures on foliation dynamics: Barcelona 2010. arXiv
**2011**, arXiv:1104.4852. [Google Scholar] - Lacasa, L.; Luque, B.; Ballesteros, F.; Luque, J.; Nuno, J.C. From time series to complex networks: The visibility graph. Proc. Natl. Acad. Sci. USA
**2008**, 105, 4972–4975. [Google Scholar] [CrossRef] [Green Version] - Rényi, A. On measures of entropy and information. In Proceedings of the Fourth Berkeley Symposium on Mathematical Statistics and Probability, Volume 1: Contributions to the Theory of Statistics; The Regents of the University of California: California, CA, USA, 1961. [Google Scholar]
- Bosyk, G.; Portesi, M.; Plastino, A. Collision entropy and optimal uncertainty. Phys. Rev. A
**2012**, 85, 012108. [Google Scholar] [CrossRef] [Green Version] - Easwaramoorthy, D.; Uthayakumar, R. Improved generalized fractal dimensions in the discrimination between healthy and epileptic EEG signals. J. Comput. Sci.
**2011**, 2, 31–38. [Google Scholar] [CrossRef] - Müller, M.P.; Pastena, M. A generalization of majorization that characterizes Shannon entropy. IEEE Trans. Inf. Theory
**2016**, 62, 1711–1720. [Google Scholar] [CrossRef] [Green Version] - Jaynes, E.T. Information theory and statistical mechanics. Phys. Rev.
**1957**, 106, 620. [Google Scholar] [CrossRef] - Posner, E. Random coding strategies for minimum entropy. IEEE Trans. Inf. Theory
**1975**, 21, 388–391. [Google Scholar] [CrossRef] - Chevalier, C.; Fouque, P.A.; Pointcheval, D.; Zimmer, S. Optimal randomness extraction from a Diffie-Hellman element. In Annual International Conference on the Theory and Applications of Cryptographic Techniques; Springer: Berlin/Heidelberg, Germany, 2009; pp. 572–589. [Google Scholar]
- Renner, R.; Wolf, S. Smooth Rényi entropy and applications. In Proceedings of the International Symposium on Information Theory, Chicago, IL, USA, 27 June–2 July 2004; p. 233. [Google Scholar]
- Lake, D.E. Renyi entropy measures of heart rate Gaussianity. IEEE Trans. Biomed. Eng.
**2005**, 53, 21–27. [Google Scholar] [CrossRef] [PubMed] - Botta-Dukát, Z. Rao’s quadratic entropy as a measure of functional diversity based on multiple traits. J. Veg. Sci.
**2005**, 16, 533–540. [Google Scholar] [CrossRef] - Rao, C.R. Diversity and dissimilarity coefficients: A unified approach. Theor. Popul. Biol.
**1982**, 21, 24–43. [Google Scholar] [CrossRef] - Havrda, J.; Charvát, F. Quantification method of classification processes. Concept of structural a-entropy. Kybernetika
**1967**, 3, 30–35. [Google Scholar] - Tsallis, C. Possible generalization of Boltzmann-Gibbs statistics. J. Stat. Phys.
**1988**, 52, 479–487. [Google Scholar] [CrossRef] - Zunino, L.; Pérez, D.; Kowalski, A.; Martín, M.; Garavaglia, M.; Plastino, A.; Rosso, O. Fractional Brownian motion, fractional Gaussian noise, and Tsallis permutation entropy. Phys. A Stat. Mech. Its Appl.
**2008**, 387, 6057–6068. [Google Scholar] [CrossRef] - Bandt, C. Ordinal time series analysis. Ecol. Model.
**2005**, 182, 229–238. [Google Scholar] [CrossRef] - Citi, L.; Guffanti, G.; Mainardi, L. Rank-based multi-scale entropy analysis of heart rate variability. In Proceedings of the Computing in Cardiology 2014, Cambridge, MA, USA, 7–10 September 2014; pp. 597–600. [Google Scholar]
- Manis, G.; Aktaruzzaman, M.; Sassi, R. Bubble entropy: An entropy almost free of parameters. IEEE Trans. Biomed. Eng.
**2017**, 64, 2711–2718. [Google Scholar] [PubMed] - Friend, E.H. Sorting on electronic computer systems. J. ACM (JACM)
**1956**, 3, 134–168. [Google Scholar] [CrossRef] - Astrachan, O. Bubble Sort: An Archaeological Algorithmic Analysis; ACM SIGCSE Bulletin; ACM: New York, NY, USA, 2003; Volume 35, pp. 1–5. [Google Scholar]
- Bodini, M.; Rivolta, M.W.; Manis, G.; Sassi, R. Analytical Formulation of Bubble Entropy for Autoregressive Processes. In Proceedings of the 2020 11th Conference of the European Study Group on Cardiovascular Oscillations (ESGCO), Pisa, Italy, 15 July 2020; pp. 1–2. [Google Scholar]
- Dehmer, M.; Mowshowitz, A. A history of graph entropy measures. Inf. Sci.
**2011**, 181, 57–78. [Google Scholar] [CrossRef] - Rashevsky, N. Life, information theory, and topology. Bull. Math. Biophys.
**1955**, 17, 229–235. [Google Scholar] [CrossRef] - Trucco, E. A note on the information content of graphs. Bull. Math. Biophys.
**1956**, 18, 129–135. [Google Scholar] [CrossRef] - Mowshowitz, A. Entropy and the complexity of graphs: I. An index of the relative complexity of a graph. Bull. Math. Biophys.
**1968**, 30, 175–204. [Google Scholar] [CrossRef] - Mowshowitz, A. Entropy and the complexity of graphs: II. The information content of digraphs and infinite graphs. Bull. Math. Biophys.
**1968**, 30, 225–240. [Google Scholar] [CrossRef] [PubMed] - Mowshowitz, A. Entropy and the complexity of graphs: III. Graphs with prescribed information content. Bull. Math. Biophys.
**1968**, 30, 387–414. [Google Scholar] [CrossRef] - Mowshowitz, A. Entropy and the complexity of graphs: IV. Entropy measures and graphical structure. Bull. Math. Biophys.
**1968**, 30, 533–546. [Google Scholar] [CrossRef] - Körner, J. Coding of an information source having ambiguous alphabet and the entropy of graphs. In Proceedings of the 6th Prague Conference on Information Theory, Prague, Czech Republic, 18–23 August 1973; pp. 411–425. [Google Scholar]
- Csiszár, I.; Körner, J.; Lovász, L.; Marton, K.; Simonyi, G. Entropy splitting for antiblocking corners and perfect graphs. Combinatorica
**1990**, 10, 27–40. [Google Scholar] [CrossRef] - Simonyi, G. Graph entropy: A survey. Comb. Optim.
**1995**, 20, 399–441. [Google Scholar] - Zhu, G.; Li, Y.; Wen, P.P.; Wang, S. Analysis of alcoholic EEG signals based on horizontal visibility graph entropy. Brain Informatics
**2014**, 1, 19–25. [Google Scholar] [CrossRef] [PubMed] [Green Version] - Luque, B.; Lacasa, L.; Ballesteros, F.; Luque, J. Horizontal visibility graphs: Exact results for random time series. Phys. Rev. E
**2009**, 80, 046103. [Google Scholar] [CrossRef] [PubMed] [Green Version] - Zhu, G.; Li, Y.; Wen, P.P. Epileptic seizure detection in EEGs signals using a fast weighted horizontal visibility algorithm. Comput. Methods Programs Biomed.
**2014**, 115, 64–75. [Google Scholar] [CrossRef] [PubMed] - Pincus, S.M. Approximate entropy as a measure of system complexity. Proc. Natl. Acad. Sci. USA
**1991**, 88, 2297–2301. [Google Scholar] [CrossRef] [Green Version] - Liang, Z.; Wang, Y.; Sun, X.; Li, D.; Voss, L.J.; Sleigh, J.W.; Hagihira, S.; Li, X. EEG entropy measures in anesthesia. Front. Comput. Neurosci.
**2015**, 9, 16. [Google Scholar] [CrossRef] [PubMed] - Pincus, S.M.; Gladstone, I.M.; Ehrenkranz, R.A. A regularity statistic for medical data analysis. J. Clin. Monit.
**1991**, 7, 335–345. [Google Scholar] [CrossRef] - Amoud, H.; Snoussi, H.; Hewson, D.; Doussot, M.; Duchene, J. Intrinsic mode entropy for nonlinear discriminant analysis. IEEE Signal Process. Lett.
**2007**, 14, 297–300. [Google Scholar] [CrossRef] - Azami, H.; Escudero, J. Amplitude-and fluctuation-based dispersion entropy. Entropy
**2018**, 20, 210. [Google Scholar] [CrossRef] [Green Version] - Zadeh, L.A. Fuzzy sets. Inf. Control
**1965**, 8, 338–353. [Google Scholar] [CrossRef] [Green Version] - De Luca, A.; Termini, S. A definition of a nonprobabilistic entropy in the setting of fuzzy sets theory. Inf. Control
**1972**, 20, 301–312. [Google Scholar] [CrossRef] [Green Version] - Parkash, C. Fuzzy and Non Fuzzy Measures of Information and Their Applications to Queueing Theory; Guru Nanak Dev University: Punjab, India, 2014. [Google Scholar]
- Chen, W.; Wang, Z.; Xie, H.; Yu, W. Characterization of surface EMG signal based on fuzzy entropy. IEEE Trans. Neural Syst. Rehabil. Eng.
**2007**, 15, 266–272. [Google Scholar] [CrossRef] - Yeniyayla, Y. Fuzzy Entropy and Its Application. Ph.D. Thesis, Dokuz Eylul University, Fen Bilimleri Enstitüsü, Izmir, Turkey, 2011. [Google Scholar]
- Liu, C.; Zhao, L. Using fuzzy measure entropy to improve the stability of traditional entropy measures. In Proceedings of the 2011 Computing in Cardiology, Hangzhou, China, 18–21 September 2011; pp. 681–684. [Google Scholar]
- Zaylaa, A.; Saleh, S.; Karameh, F.; Nahas, Z.; Bouakaz, A. Cascade of nonlinear entropy and statistics to discriminate fetal heart rates. In Proceedings of the 2016 3rd International Conference on Advances in Computational Tools for Engineering Applications (ACTEA), Beirut, Lebanon, 13–15 July 2016; pp. 152–157. [Google Scholar]
- Mekyska, J.; Janousova, E.; Gomez-Vilda, P.; Smekal, Z.; Rektorova, I.; Eliasova, I.; Kostalova, M.; Mrackova, M.; Alonso-Hernandez, J.B.; Faundez-Zanuy, M.; et al. Robust and complex approach of pathological speech signal analysis. Neurocomputing
**2015**, 167, 94–111. [Google Scholar] [CrossRef] - Costa, M.; Goldberger, A.L.; Peng, C.K. Multiscale entropy analysis of complex physiologic time series. Phys. Rev. Lett.
**2002**, 89, 068102. [Google Scholar] [CrossRef] [PubMed] [Green Version] - Zhang, Y.C. Complexity and 1/f noise. A phase space approach. J. Phys. I
**1991**, 1, 971–977. [Google Scholar] [CrossRef] - Hsu, C.F.; Wei, S.Y.; Huang, H.P.; Hsu, L.; Chi, S.; Peng, C.K. Entropy of entropy: Measurement of dynamical complexity for biological systems. Entropy
**2017**, 19, 550. [Google Scholar] [CrossRef] - Wu, S.D.; Wu, C.W.; Lin, S.G.; Wang, C.C.; Lee, K.Y. Time series analysis using composite multiscale entropy. Entropy
**2013**, 15, 1069–1084. [Google Scholar] [CrossRef] [Green Version] - Valencia, J.F.; Porta, A.; Vallverdu, M.; Claria, F.; Baranowski, R.; Orlowska-Baranowska, E.; Caminal, P. Refined multiscale entropy: Application to 24-h holter recordings of heart period variability in healthy and aortic stenosis subjects. IEEE Trans. Biomed. Eng.
**2009**, 56, 2202–2213. [Google Scholar] [CrossRef] - Wu, S.D.; Wu, C.W.; Lee, K.Y.; Lin, S.G. Modified multiscale entropy for short-term time series analysis. Phys. A Stat. Mech. Appl.
**2013**, 392, 5865–5873. [Google Scholar] [CrossRef] - Costa, M.D.; Goldberger, A.L. Generalized multiscale entropy analysis: Application to quantifying the complex volatility of human heartbeat time series. Entropy
**2015**, 17, 1197–1203. [Google Scholar] [CrossRef] - Ahmed, M.U.; Mandic, D.P. Multivariate multiscale entropy: A tool for complexity analysis of multichannel data. Phys. Rev. E
**2011**, 84, 061918. [Google Scholar] [CrossRef] [PubMed] [Green Version] - Kolmogorov, A.N. Three approaches to the quantitative definition of information. Probl. Inf. Transm.
**1965**, 1, 1–7. [Google Scholar] - Solomonoff, R.J. A formal theory of inductive inference. Part I. Inf. Control
**1964**, 7, 1–22. [Google Scholar] [CrossRef] [Green Version] - Chaitin, G.J. On the length of programs for computing finite binary sequences. J. ACM (JACM)
**1966**, 13, 547–569. [Google Scholar] [CrossRef] - Teixeira, A.; Matos, A.; Souto, A.; Antunes, L. Entropy measures vs. Kolmogorov complexity. Entropy
**2011**, 13, 595–611. [Google Scholar] [CrossRef] - Zegers, P. Fisher information properties. Entropy
**2015**, 17, 4918–4939. [Google Scholar] [CrossRef] [Green Version] - Fisher, R.A. Theory of statistical estimation. Mathematical Proceedings of the Cambridge Philosophical Society; Cambridge University Press: Cambridge, UK, 1925; Volume 22, pp. 700–725. [Google Scholar]
- Blahut, R.E. Principles and Practice of Information Theory; Addison-Wesley Longman Publishing Co., Inc.: Cambridge, MA, USA, 1987. [Google Scholar]
- Stam, A.J. Some inequalities satisfied by the quantities of information of Fisher and Shannon. Inf. Control
**1959**, 2, 101–112. [Google Scholar] [CrossRef] [Green Version] - Borzadaran, G.M. Relationship between entropies, variance and Fisher information. In Proceedings of the AIP Conference Proceedings; American Institute of Physics: Melville, NY, USA, 2001; Volume 568, pp. 139–144. [Google Scholar]
- Mukherjee, D.; Ratnaparkhi, M.V. On the functional relationship between entropy and variance with related applications. Commun. Stat. Theory Methods
**1986**, 15, 291–311. [Google Scholar] - Toomaj, A.; Di Crescenzo, A. Connections between weighted generalized cumulative residual entropy and variance. Mathematics
**2020**, 8, 1072. [Google Scholar] [CrossRef] - Gibson, J. Entropy power, autoregressive models, and mutual information. Entropy
**2018**, 20, 750. [Google Scholar] [CrossRef] [PubMed] [Green Version] - Ledoux, M.; Nair, C.; Wang, Y.N. Log-Convexity of Fisher Information along Heat Flow; University of Toulouse: Toulouse, France, 2021. [Google Scholar]
- Vieira, E.; Gomes, J. A comparison of Scopus and Web of Science for a typical university. Scientometrics
**2009**, 81, 587–600. [Google Scholar] [CrossRef] - Liu, W.; Tang, L.; Hu, G. Funding information in Web of Science: An updated overview. arXiv
**2020**, arXiv:2001.04697. [Google Scholar] [CrossRef] [Green Version] - Franceschini, F.; Maisano, D.; Mastrogiacomo, L. Empirical analysis and classification of database errors in Scopus and Web of Science. J. Inf.
**2016**, 10, 933–953. [Google Scholar] [CrossRef] - Meho, L.I.; Yang, K. Impact of data sources on citation counts and rankings of LIS faculty: Web of Science versus Scopus and Google Scholar. J. Am. Soc. Inf. Sci. Technol.
**2007**, 58, 2105–2125. [Google Scholar] [CrossRef] - Mugnaini, R.; Strehl, L. Recuperação e impacto da produção científica na era Google: Uma análise comparativa entre o Google Acadêmico e a Web of Science [Retrieval and impact of scientific production in the Google era: A comparative analysis of Google Scholar and the Web of Science]. In Revista Eletrônica de Biblioteconomia e Ciência da Informação; Encontros Bibli: Florianopolis, Brazil, 2008; n. esp.; pp. 92–105. [Google Scholar]
- Falagas, M.E.; Pitsouni, E.I.; Malietzis, G.A.; Pappas, G. Comparison of PubMed, Scopus, Web of Science, and Google scholar: Strengths and weaknesses. FASEB J.
**2008**, 22, 338–342. [Google Scholar] [CrossRef] - Scopus. Content Selection and Advisory Board (CSAB). Available online: https://www.elsevier.com/solutions/scopus/how-scopus-works/content (accessed on 18 June 2020).

**Figure 1.** Timeline of the universe of entropies discussed in this paper. The timeline is in logarithmic scale, and colors indicate the section in which each entropy is defined.

**Figure 6.** Relationships between Havrda–Charvát structural $\alpha$-entropy, Tsallis entropy, and other entropies.

**Figure 8.** Relations between topological information content, graph entropy, and horizontal visibility graph entropy.

**Figure 11.** Number of citations per year in the WoS between 2004 and 2019 of the papers proposing each measure of entropy, in logarithmic scale ($\log_2(\text{Number of citations})$). In the legend, for papers cited in more than three years, the ordered pair ($\beta$, p-value) gives the slope of the regression line, $\beta$, and the respective p-value. Statistically significant slopes ($p<0.05$) are marked with *.

**Figure 12.** Number of citations per year in Scopus between 2004 and 2019 of the papers proposing each measure of entropy, in logarithmic scale ($\log_2(\text{Number of citations})$). In the legend, for papers cited in more than three years, the ordered pair ($\beta$, p-value) gives the slope of the regression line, $\beta$, and the respective p-value. Statistically significant slopes ($p<0.05$) are marked with *.

**Figure 13.** The ten areas that most cited each paper introducing an entropy, according to the Research Areas of the WoS. Legend: scale from 0 (least-citing research area) to 10 (most-citing research area).

**Figure 14.** The ten areas that most cited each paper introducing an entropy, according to the Documents by subject area of Scopus. Legend: scale from 0 (least-citing subject area) to 10 (most-citing subject area).

**Table 1.** Reference and number of citations in Scopus and WoS of the paper that presented each entropy.

Name of Entropy | Reference | Year | Scopus | Web of Science |
---|---|---|---|---|
Boltzmann entropy | [25] | 1900 | 5 | - |
Gibbs entropy | [26] | 1902 | 1343 | - |
Hartley entropy | [29] | 1928 | 902 | - |
Von Neumann entropy | [30] | 1932 | 1887 | - |
Shannon/differential entropies | [33] | 1948 | 34,751 | 32,040 |
Boltzmann-Gibbs-Shannon | [42] | 1955 | 8 | 7 |
Topological information content | [100] | 1955 | 204 | - |
Maximum entropy | [83] | 1957 | 6661 | 6283 |
Kolmogorov entropy | [55] | 1958 | 693 | 662 |
Rényi entropy | [79] | 1961 | 3149 | - |
Topological entropy | [66] | 1965 | 728 | 682 |
Havrda–Charvát structural $\alpha$-entropy | [90] | 1967 | 744 | - |
Graph entropy | [102] | 1968 | 207 | 195 |
Fuzzy entropy | [118] | 1972 | 1395 | - |
Minimum entropy | [84] | 1975 | 22 | 17 |
Geometric entropy | [74] | 1988 | 71 | - |
Tsallis entropy | [91] | 1988 | 5745 | 5467 |
Approximate entropy | [112] | 1991 | 3562 | 3323 |
Spectral entropy | [49] | 1992 | 915 | 26 |
Tone-entropy | [51] | 1997 | 85 | 76 |
Sample entropy | [15] | 2000 | 3770 | 3172 |
Wavelet entropy | [52] | 2001 | 582 | 465 |
Permutation/sorting entropies | [13] | 2002 | 1900 | 1708 |
Smooth Rényi entropy | [86] | 2004 | 112 | 67 |
Kernel-based entropy | [19] | 2005 | 15 | 13 |
Quadratic sample entropy | [87] | 2005 | 65 | 68 |
Empirical mode decomposition energy entropy | [53] | 2006 | 391 | 359 |
Intrinsic mode dispersion entropy | [115] | 2007 | 59 | 55 |
Tsallis permutation entropy | [92] | 2008 | 35 | 37 |
Modified sample entropy | [17] | 2008 | 58 | 51 |
Coefficient of sample entropy | [16] | 2011 | 159 | 136 |
$\Delta$-entropy | [54] | 2011 | 13 | 10 |
Fuzzy entropy | [122] | 2011 | 23 | 18 |
Rényi permutation entropy | [14] | 2013 | 28 | 26 |
Horizontal visibility graph entropy | [109] | 2014 | 22 | - |
Rank-based entropy | [94] | 2014 | 6 | 6 |
Kernels entropies | [124] | 2015 | 46 | 39 |
Dispersion entropy | [18] | 2016 | 98 | 84 |
Bubble entropy | [95] | 2017 | 25 | 21 |
Fluctuation-based dispersion entropy | [116] | 2018 | 16 | 10 |

Legend: - indicates that the paper was not found in the database.

Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).

## Share and Cite

**MDPI and ACS Style**

Ribeiro, M.; Henriques, T.; Castro, L.; Souto, A.; Antunes, L.; Costa-Santos, C.; Teixeira, A.
The Entropy Universe. *Entropy* **2021**, *23*, 222.
https://doi.org/10.3390/e23020222

**AMA Style**

Ribeiro M, Henriques T, Castro L, Souto A, Antunes L, Costa-Santos C, Teixeira A.
The Entropy Universe. *Entropy*. 2021; 23(2):222.
https://doi.org/10.3390/e23020222

**Chicago/Turabian Style**

Ribeiro, Maria, Teresa Henriques, Luísa Castro, André Souto, Luís Antunes, Cristina Costa-Santos, and Andreia Teixeira.
2021. "The Entropy Universe" *Entropy* 23, no. 2: 222.
https://doi.org/10.3390/e23020222