# A Brief Review of Generalized Entropies


## Abstract


## 1. Introduction

## 2. Generalized Entropies

**SK1**- Continuity. $H({p}_{1},\dots ,{p}_{W})$ depends continuously on all variables for each W.
**SK2**- Maximality. For all W,$$H({p}_{1},\dots ,{p}_{W})\le H({\textstyle \frac{1}{W}},\dots ,{\textstyle \frac{1}{W}}).$$
**SK3**- Expansibility: For all W and $1\le i\le W$,$$H(0,{p}_{1},\dots ,{p}_{W})=H({p}_{1},\dots ,{p}_{i},0,{p}_{i+1},\dots ,{p}_{W})=H({p}_{1},\dots ,{p}_{i},{p}_{i+1},\dots ,{p}_{W}).$$
**SK4**- Separability (or strong additivity): For all $W,U$,$$H({p}_{11},\dots ,{p}_{1U},{p}_{21},\dots ,{p}_{2U},\dots ,{p}_{W1},\dots ,{p}_{WU})=H({p}_{1\cdot},{p}_{2\cdot},\dots ,{p}_{W\cdot})+\sum _{i=1}^{W}{p}_{i\cdot}\,H\!\left(\frac{{p}_{i1}}{{p}_{i\cdot}},\frac{{p}_{i2}}{{p}_{i\cdot}},\dots ,\frac{{p}_{iU}}{{p}_{i\cdot}}\right),$$where ${p}_{i\cdot}={\sum}_{u=1}^{U}{p}_{iu}$.
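As a quick numerical illustration (a sketch of ours, not part of the original text), the strong additivity axiom SK4 can be checked for the Shannon entropy on a small joint distribution; all function and variable names below are ours:

```python
import math

def shannon(p):
    """BGS/Shannon entropy in nats; terms with p_i = 0 contribute 0."""
    return -sum(x * math.log(x) for x in p if x > 0)

# Joint distribution p[i][u] over a W x U grid (W = 2, U = 3 here).
joint = [[0.10, 0.05, 0.15],
         [0.20, 0.30, 0.20]]

marginal = [sum(row) for row in joint]     # p_i. = sum_u p_iu
flat = [p for row in joint for p in row]

# SK4: H(joint) = H(marginal) + sum_i p_i. * H(conditional distribution of row i)
lhs = shannon(flat)
rhs = shannon(marginal) + sum(
    m * shannon([p / m for p in row]) for row, m in zip(joint, marginal)
)
assert abs(lhs - rhs) < 1e-12
```

The identity holds exactly for the Shannon entropy; for a generic generalized entropy it does not, which is precisely why SK4 singles out the BGS form.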

- **(i)** Symmetry: ${F}_{g}({p}_{1},\dots ,{p}_{W})$ is invariant under permutations of ${p}_{1},\dots ,{p}_{W}$.
- **(ii)** ${F}_{g}$ satisfies axiom SK1 if and only if $g$ is continuous.
- **(iii)** If ${F}_{g}$ satisfies axiom SK2, then$$\sum _{i=1}^{W}g({p}_{i})\le Wg({\textstyle \frac{1}{W}}).$$
- **(iv)** If $g$ is concave (i.e., ∩-convex), then ${F}_{g}$ satisfies axiom SK2.
- **(v)** ${F}_{g}$ satisfies axiom SK3 if and only if $g(0)=0$.

- **(C1)** $g$ is continuous.
- **(C2)** $g$ is concave.
- **(C3)** $g(0)=0$.
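The following minimal sketch (ours, not from the original text) implements the trace-form functional ${F}_{g}$ and checks property (iv): for a concave $g$ satisfying C1, C2 and C3, the uniform distribution maximizes ${F}_{g}$. The BGS choice of $g$ is used for illustration; `random_dist` is a helper of ours:

```python
import math
import random

def F(g, p):
    """Trace-form entropic functional F_g(p) = sum_i g(p_i)."""
    return sum(g(x) for x in p)

def random_dist(W):
    """A random probability distribution on W outcomes."""
    w = [random.random() for _ in range(W)]
    s = sum(w)
    return [x / s for x in w]

# An illustrative g satisfying C1-C3: continuous, concave, g(0) = 0 (BGS choice).
g = lambda x: -x * math.log(x) if x > 0 else 0.0

random.seed(0)
W = 5
uniform_val = F(g, [1.0 / W] * W)   # = ln 5 for the BGS choice

# Property (iv): concave g implies SK2 (maximality at the uniform distribution).
assert all(F(g, random_dist(W)) <= uniform_val + 1e-12 for _ in range(1000))
```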

## 3. Examples of Generalized Entropies

#### 3.1. Tsallis Entropy

- **(T1)** ${T}_{1}={S}_{BGS}$ because ${\ln}_{1}{p}_{i}=\ln {p}_{i}$ (or ${D}_{1}f(x)=df(x)/dx$).
- **(T2)** ${T}_{q}$ is (strictly) ∩-convex for $q>0$. Figure 1 plots ${T}_{q}(p,1-p)$ for $q=0.5$, 1, 2 and 5. Let us mention in passing that ${T}_{q}$ is ∪-convex for $q<0$.
- **(T3)**
- **(T4)**
- **(T5)** As with the BGS entropy, the Tsallis entropy can be uniquely determined (up to a multiplicative positive constant) by a small number of axioms. Thus, Abe [35] characterized the Tsallis entropy by: (i) continuity; (ii) the monotonic increase of ${T}_{q}(1/W,\dots ,1/W)$ with respect to $W$; (iii) expansibility; and (iv) a property involving conditional entropies. Dos Santos [36], on the other hand, used the previous Axioms (i) and (ii), q-additivity, and a generalization of the grouping axiom (Equation (9)). Suyari [37] derived ${T}_{q}$ from the first three Shannon–Khinchin axioms and a generalization of the fourth one. Perhaps the most economical characterization of ${T}_{q}$ was given by Furuichi [38]; it consists of continuity, symmetry under permutations of ${p}_{1},\dots ,{p}_{W}$, and a property called q-recursivity. As mentioned in Section 2, Tsallis entropy was recently shown [27] to be the only composable generalized entropy of the form in Equation (10) under some technical assumptions. Further axiomatic characterizations of the Tsallis entropy can be found in [39].
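The properties above lend themselves to quick numerical checks. The sketch below (ours; the function name `tsallis` is an assumed helper) verifies the $q\to 1$ limit of (T1) and the pseudo-additivity identity commonly called q-additivity, ${T}_{q}(A\times B)={T}_{q}(A)+{T}_{q}(B)+(1-q){T}_{q}(A){T}_{q}(B)$, for independent systems:

```python
import math

def tsallis(p, q):
    """Tsallis entropy T_q; T_1 is taken as the BGS limit."""
    if q == 1:
        return -sum(x * math.log(x) for x in p if x > 0)
    return (1.0 - sum(x ** q for x in p)) / (q - 1.0)

p = [0.2, 0.3, 0.5]

# (T1): T_q -> S_BGS as q -> 1.
assert abs(tsallis(p, 1.0001) - tsallis(p, 1)) < 1e-3

# q-additivity for the product of two independent distributions:
# T_q(A x B) = T_q(A) + T_q(B) + (1 - q) T_q(A) T_q(B).
r, q = [0.6, 0.4], 2.0
prod = [a * b for a in p for b in r]
lhs = tsallis(prod, q)
rhs = tsallis(p, q) + tsallis(r, q) + (1 - q) * tsallis(p, q) * tsallis(r, q)
assert abs(lhs - rhs) < 1e-12
```

The pseudo-additivity check holds exactly because $\sum_{i,j}({p}_{i}{r}_{j})^{q}=(\sum_i {p}_{i}^{q})(\sum_j {r}_{j}^{q})$.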

#### 3.2. Rényi Entropy

- **(R1)** ${R}_{q}$ is additive by construction.
- **(R2)** ${R}_{1}\equiv {\lim}_{q\to 1}{R}_{q}={S}_{BGS}$. Indeed, use L’Hôpital’s rule to derive$$\lim_{q\to 1}\frac{1}{1-q}\ln\left(\sum _{i=1}^{W}{p}_{i}^{q}\right)=-\lim_{q\to 1}\frac{d}{dq}\ln\left(\sum _{i=1}^{W}{p}_{i}^{q}\right)=-\lim_{q\to 1}\frac{1}{{\sum}_{i=1}^{W}{p}_{i}^{q}}\sum _{i=1}^{W}{p}_{i}^{q}\ln {p}_{i}=-\sum _{i=1}^{W}{p}_{i}\ln {p}_{i}.$$
- **(R3)** ${R}_{q}$ is ∩-convex for $0<q\le 1$, and it is neither ∩-convex nor ∪-convex for $q>1$. Figure 2 plots ${R}_{q}(p,1-p)$ for $q=0.5$, 1, 2 and 5.
- **(R4)**
- **(R5)** The entropies ${R}_{q}$ are monotonically decreasing with respect to the parameter $q$ for any probability distribution, i.e.,$$q<{q}^{\prime}\;\Longrightarrow\;{R}_{q}\ge {R}_{{q}^{\prime}}.$$This property follows from the formula$$-\frac{d{R}_{q}}{dq}=\frac{1}{{(1-q)}^{2}}\sum _{i=1}^{W}{p}_{i}^{\prime}\ln\frac{{p}_{i}^{\prime}}{{p}_{i}}=\frac{1}{{(1-q)}^{2}}\,D\big(\{{p}_{i}^{\prime}\}\parallel \{{p}_{i}\}\big)\ge 0,$$where ${p}_{i}^{\prime}={p}_{i}^{q}/{\sum}_{j=1}^{W}{p}_{j}^{q}$ is the escort distribution and $D(\cdot\parallel\cdot)$ is the Kullback–Leibler divergence, which is non-negative.
- **(R6)** A straightforward relation between Rényi’s and Tsallis’ entropies is the following [50]:$${T}_{q}=\frac{1}{1-q}\left({e}^{(1-q){R}_{q}}-1\right).$$
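Properties (R2), (R5) and (R6) can likewise be checked numerically. The sketch below (ours; function names are assumptions, not from the original text) evaluates the $q\to 1$ limit, the monotonic decrease in $q$, and the Rényi–Tsallis relation of (R6):

```python
import math

def renyi(p, q):
    """Rényi entropy R_q; R_1 is the BGS limit (property R2)."""
    if q == 1:
        return -sum(x * math.log(x) for x in p if x > 0)
    return math.log(sum(x ** q for x in p)) / (1.0 - q)

def tsallis(p, q):
    return (1.0 - sum(x ** q for x in p)) / (q - 1.0)

p = [0.1, 0.2, 0.3, 0.4]

# (R2): R_q -> S_BGS as q -> 1 (the L'Hopital limit above).
assert abs(renyi(p, 1 + 1e-6) - renyi(p, 1)) < 1e-4

# (R5): R_q is non-increasing in q.
qs = [0.25, 0.5, 0.99, 1.5, 2.0, 5.0]
vals = [renyi(p, q) for q in qs]
assert all(a >= b - 1e-12 for a, b in zip(vals, vals[1:]))

# (R6): T_q = (exp((1 - q) R_q) - 1) / (1 - q).
q = 2.0
assert abs(tsallis(p, q) - (math.exp((1 - q) * renyi(p, q)) - 1) / (1 - q)) < 1e-12
```

Relation (R6) holds exactly for every $q\ne 1$ since ${e}^{(1-q){R}_{q}}={\sum}_{i}{p}_{i}^{q}$.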

#### 3.3. Graph Related Entropies

## 4. Hanel–Thurner Exponents

- **(E1)** For the BGS entropy, $g(x)=-x\ln x$ (see Equation (11)), so$$\frac{g(zx)}{g(x)}=\frac{z(\ln z+\ln x)}{\ln x}\to z={z}^{c}\quad (x\to {0}^{+}),$$which yields $c=1$, and$$\frac{g({x}^{1+a})}{{x}^{ac}\,g(x)}=\frac{(1+a)\,{x}^{1+a}\ln x}{{x}^{a}\cdot x\ln x}=1+a={(1+a)}^{d}$$(with $c=1$), which yields $d=1$. Hence, $(c,d)=(1,1)$.
- **(E2)** For the Tsallis entropy, see Equation (12),$$g(x)=\begin{cases}\frac{1}{1-q}{x}^{q}+\mathcal{O}(x)&\text{if }0<q<1,\\[4pt] -\frac{1}{1-q}\,x+\mathcal{O}({x}^{q})&\text{if }q>1.\end{cases}$$It follows readily that $(c,d)=(q,0)$ if $0<q<1$, and $(c,d)=(1,0)$ if $q>1$. Hence, although ${\lim}_{q\to 1}{T}_{q}={S}_{BGS}$, there is no parallel convergence of the HT exponents.
- **(E3)** For the Rényi entropy, $g(x)={x}^{q}$ and $G(u)=\frac{1}{1-q}\ln u$ (see Equation (15)), so$$\frac{G\left(\frac{\lambda}{x}\,g\left(\frac{x}{\lambda}\right)\right)}{G\left(\frac{1}{x}\,g(x)\right)}=\frac{\ln \lambda -\ln x}{-\ln x}\to 1\quad (x\to {0}^{+}),$$which yields $c=1$, and$$\frac{G\left(\frac{1}{{x}^{1+a}}\,g({x}^{1+a})\right)}{G\left(\frac{1}{x}\,g(x)\right)}=\frac{(1+a)(-\ln x)}{-\ln x}=1+a={(1+a)}^{d},$$which yields $d=1$. Hence, $(c,d)=(1,1)$, the same HT class as the BGS entropy, in accordance with the additivity of ${R}_{q}$.
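The limits defining the HT exponent $c$ can be probed at small but finite $x$. The sketch below (ours, not from the original text) recovers $c\approx 1$ for the BGS case (E1) and $c\approx q$ for the Tsallis case (E2) with $q=0.5$; the finite-$x$ estimate is only approximate, hence the loose tolerances:

```python
import math

# HT exponent c for a trace-form entropy (Hanel-Thurner):
#   lim_{x -> 0+} g(z x) / g(x) = z^c.
def estimate_c(g, z=2.0, x=1e-15):
    """Finite-x estimate of c; exact only in the x -> 0+ limit."""
    return math.log(g(z * x) / g(x)) / math.log(z)

g_bgs = lambda x: -x * math.log(x)                     # (E1): expect c = 1
g_tsallis_05 = lambda x: (x ** 0.5 - x) / (1 - 0.5)    # (E2), q = 0.5: expect c = 0.5

assert abs(estimate_c(g_bgs) - 1.0) < 0.05
assert abs(estimate_c(g_tsallis_05) - 0.5) < 0.05
```

The slow convergence of the BGS estimate reflects the logarithmic correction $z(\ln z+\ln x)/\ln x$, which approaches $z$ only as $\ln x\to -\infty$.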

## 5. Asymptotic Relation between the HT Exponent c and the Diffusion Scaling Exponent

#### 5.1. The Non-Stationary Regime

#### 5.2. Relation between the Stationary and Non-Stationary Regime

## 6. Conclusions

## Author Contributions

## Funding

## Acknowledgments

## Conflicts of Interest

## Appendix A

- ${S}_{\eta}(\{{p}_{i}\})={\sum}_{i}\Gamma\!\left(\frac{\eta +1}{\eta},-\ln {p}_{i}\right)-{p}_{i}\,\Gamma\!\left(\frac{\eta +1}{\eta}\right)\;(\eta >0)$ [67].
- ${S}_{\kappa}(\left\{{p}_{i}\right\})={\sum}_{i}\frac{{p}_{i}^{1-\kappa}-{p}_{i}^{1+\kappa}}{2\kappa}\phantom{\rule{0.277778em}{0ex}}(0<\kappa <1)$ [68].
- ${S}_{b}(\{{p}_{i}\})={\sum}_{i}(1-{e}^{-b{p}_{i}})+{e}^{-b}-1\;(b>0)$ [69].
- ${S}_{E}(\left\{{p}_{i}\right\})={\sum}_{i}{p}_{i}(1-{e}^{({p}_{i}-1)/{p}_{i}})$ [70].
- ${S}_{\beta}(\left\{{p}_{i}\right\})={\sum}_{i}{p}_{i}^{\beta}ln(1/{p}_{i})\phantom{\rule{0.277778em}{0ex}}(0<\beta \le 1)$ [71].
- ${S}_{\gamma}(\left\{{p}_{i}\right\})={\sum}_{i}{p}_{i}{ln}^{1/\gamma}(1/{p}_{i})$ ([15], page 60).
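Two of the entropies listed above, implemented as a sketch of ours (function names are assumptions); for ${S}_{b}$ we assume the sign convention $1-{e}^{-b{p}_{i}}$, under which the entropy vanishes on a certainty distribution:

```python
import math

def kaniadakis(p, kappa):
    """Kaniadakis kappa-entropy [68]:
    S_kappa = sum_i (p_i^(1-kappa) - p_i^(1+kappa)) / (2 kappa), 0 < kappa < 1."""
    return sum((x ** (1 - kappa) - x ** (1 + kappa)) / (2 * kappa) for x in p)

def curado_b(p, b):
    """Exponential-form entropy [69] (assumed sign convention):
    S_b = sum_i (1 - exp(-b p_i)) + exp(-b) - 1, b > 0."""
    return sum(1 - math.exp(-b * x) for x in p) + math.exp(-b) - 1

# Both vanish on a certainty distribution and are positive otherwise.
assert abs(kaniadakis([1.0], 0.5)) < 1e-12
assert abs(curado_b([1.0], 2.0)) < 1e-12
assert kaniadakis([0.5, 0.5], 0.5) > 0
assert curado_b([0.5, 0.5], 2.0) > 0

# kappa -> 0 recovers the BGS entropy, since
# (p^(1-k) - p^(1+k)) / (2k) = -p ln p + O(k^2).
bgs = -sum(x * math.log(x) for x in [0.3, 0.7])
assert abs(kaniadakis([0.3, 0.7], 1e-6) - bgs) < 1e-6
```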

## Appendix B

## Appendix C

**1.** First, note from Equation (32) that

**2.** Use the asymptotic approximation 6.5.32 of [59]

**3.** From Equation (A3), we obtain
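For the reader's convenience, the approximation invoked in Step 2, formula 6.5.32 of [59], is the large-argument asymptotic expansion of the upper incomplete gamma function; we quote it here as usually stated:

```latex
\Gamma(a,z) \sim z^{a-1} e^{-z}
\left[ 1 + \frac{a-1}{z} + \frac{(a-1)(a-2)}{z^{2}} + \cdots \right],
\qquad z \to \infty .
```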

## References

1. Clausius, R. The Mechanical Theory of Heat; McMillan and Co.: London, UK, 1865.
2. Boltzmann, L. Weitere Studien über das Wärmegleichgewicht unter Gasmolekülen. Sitz. Ber. Akad. Wiss. Wien (II) 1872, 66, 275–370.
3. Boltzmann, L. Über die Beziehung eines allgemeinen mechanischen Satzes zum zweiten Hauptsatz der Wärmetheorie. Sitz. Ber. Akad. Wiss. Wien (II) 1877, 75, 67–73.
4. Gibbs, J.W. Elementary Principles in Statistical Mechanics—Developed with Especial References to the Rational Foundation of Thermodynamics; C. Scribner’s Sons: New York, NY, USA, 1902.
5. Dewar, R. Information theory explanation of the fluctuation theorem, maximum entropy production and self-organized criticality in nonequilibrium stationary state. J. Phys. A Math. Gen. 2003, 36, 631–641.
6. Martyushev, L.M. Entropy and entropy production: Old misconceptions and new breakthroughs. Entropy 2013, 15, 1152–1170.
7. Shannon, C.E. A mathematical theory of communication. Bell Syst. Tech. J. 1948, 27, 379–423.
8. Wissner-Gross, A.D.; Freer, C.E. Causal entropic forces. Phys. Rev. Lett. 2013, 110, 168702.
9. Mann, R.P.; Garnett, R. The entropic basis of collective behaviour. J. R. Soc. Interface 2015, 12, 20150037.
10. Kolmogorov, A.N. A new metric invariant of transitive dynamical systems and Lebesgue space endomorphisms. Dokl. Acad. Sci. USSR 1958, 119, 861–864.
11. Rényi, A. On measures of entropy and information. In Proceedings of the 4th Berkeley Symposium on Mathematics, Statistics and Probability; Neyman, J., Ed.; University of California Press: Berkeley, CA, USA, 1961; pp. 547–561.
12. Tsallis, C. Possible generalization of Boltzmann–Gibbs statistics. J. Stat. Phys. 1988, 52, 479–487.
13. Amigó, J.M.; Keller, K.; Unakafova, V. On entropy, entropy-like quantities, and applications. Disc. Cont. Dyn. Syst. B 2015, 20, 3301–3343.
14. Csiszár, I. Axiomatic characterization of information measures. Entropy 2008, 10, 261–273.
15. Tsallis, C. Introduction to Nonextensive Statistical Mechanics; Springer: New York, NY, USA, 2009.
16. Hanel, R.; Thurner, S. A comprehensive classification of complex statistical systems and an axiomatic derivation of their entropy and distribution functions. EPL 2011, 93, 20006.
17. Principe, J.C. Information Theoretic Learning: Renyi’s Entropy and Kernel Perspectives; Springer: New York, NY, USA, 2010.
18. Hernández, S. Introducing Graph Entropy. Available online: http://entropicai.blogspot.com/search/label/Graph%20entropy (accessed on 22 October 2018).
19. Salicrú, M.; Menéndez, M.L.; Morales, D.; Pardo, L. Asymptotic distribution of (h, ϕ)-entropies. Commun. Stat. Theory Meth. 1993, 22, 2015–2031.
20. Bosyk, G.M.; Zozor, S.; Holik, F.; Portesi, M.; Lamberti, P.W. A family of generalized quantum entropies: Definition and properties. Quantum Inf. Process. 2016, 15, 3393–3420.
21. Von Neumann, J. Thermodynamik quantenmechanischer Gesamtheiten. Nachrichten von der Gesellschaft der Wissenschaften zu Göttingen 1927, 1927, 273–291. (In German)
22. Hein, C.A. Entropy in operational statistics and quantum logic. Found. Phys. 1979, 9, 751–786.
23. Short, A.J.; Wehner, S. Entropy in general physical theories. New J. Phys. 2010, 12, 033023.
24. Holik, F.; Bosyk, G.M.; Bellomo, G. Quantum information as a non-Kolmogorovian generalization of Shannon’s theory. Entropy 2015, 17, 7349–7373.
25. Portesi, M.; Holik, F.; Lamberti, P.W.; Bosyk, G.M.; Bellomo, G.; Zozor, S. Generalized entropies in quantum and classical statistical theories. Eur. Phys. J. Spec. Top. 2018, 227, 335–344.
26. Cover, T.M.; Thomas, J.A. Elements of Information Theory; John Wiley and Sons: Hoboken, NJ, USA, 2006.
27. Enciso, A.; Tempesta, P. Uniqueness and characterization theorems for generalized entropies. J. Stat. Mech. 2017, 123101.
28. Khinchin, A.I. Mathematical Foundations of Information Theory; Dover Publications: New York, NY, USA, 1957.
29. Ash, R.B. Information Theory; Dover Publications: New York, NY, USA, 1990.
30. MacKay, D.J. Information Theory, Inference, and Learning Algorithms; Cambridge University Press: Cambridge, UK, 2003.
31. Bandt, C. A new kind of permutation entropy used to classify sleep stages from invisible EEG microstructure. Entropy 2017, 19, 197.
32. Havrda, J.; Charvát, F. Quantification method of classification processes. Concept of structural α-entropy. Kybernetika 1967, 3, 30–35.
33. Abe, S. Stability of Tsallis entropy and instabilities of Rényi and normalized Tsallis entropies. Phys. Rev. E 2002, 66, 046134.
34. Tsallis, C.; Brigatti, E. Nonextensive statistical mechanics: A brief introduction. Contin. Mech. Thermodyn. 2004, 16, 223–235.
35. Abe, S. Tsallis entropy: How unique? Contin. Mech. Thermodyn. 2004, 16, 237–244.
36. Dos Santos, R.J.V. Generalization of Shannon’s theorem for Tsallis entropy. J. Math. Phys. 1997, 38, 4104–4107.
37. Suyari, H. Generalization of Shannon–Khinchin axioms to nonextensive systems and the uniqueness theorem for the nonextensive entropy. IEEE Trans. Inf. Theory 2004, 50, 1783–1787.
38. Furuichi, S. On uniqueness theorems for Tsallis entropy and Tsallis relative entropy. IEEE Trans. Inf. Theory 2005, 51, 3638–3645.
39. Jäckle, S.; Keller, K. Tsallis entropy and generalized Shannon additivity. Axioms 2016, 6, 14.
40. Hanel, R.; Thurner, S. When do generalized entropies apply? How phase space volume determines entropy. Europhys. Lett. 2011, 96, 50003.
41. Plastino, A.R.; Plastino, A. Stellar polytropes and Tsallis’ entropy. Phys. Lett. A 1993, 174, 384–386.
42. Alemany, P.A.; Zanette, D.H. Fractal random walks from a variational formalism for Tsallis entropies. Phys. Rev. E 1994, 49, R956–R958.
43. Plastino, A.R.; Plastino, A. Non-extensive statistical mechanics and generalized Fokker–Planck equation. Physica A 1995, 222, 347–354.
44. Tsallis, C.; Bukman, D.J. Anomalous diffusion in the presence of external forces: Exact time-dependent solutions and their thermostatistical basis. Phys. Rev. E 1996, 54, R2197.
45. Capurro, A.; Diambra, L.; Lorenzo, D.; Macadar, O.; Martin, M.T.; Mostaccio, C.; Plastino, A.; Rofman, E.; Torres, M.E.; Velluti, J. Tsallis entropy and cortical dynamics: The analysis of EEG signals. Physica A 1998, 257, 149–155.
46. Maszczyk, T.; Duch, W. Comparison of Shannon, Rényi and Tsallis entropy used in decision trees. In Proceedings of the International Conference on Artificial Intelligence and Soft Computing, Zakopane, Poland, 22–26 June 2008; Springer: Berlin, Germany, 2008; pp. 643–651.
47. Gajowniczek, K.; Karpio, K.; Łukasiewicz, P.; Orłowski, A.; Zabkowski, T. Q-Entropy approach to selecting high income households. Acta Phys. Pol. A 2015, 127, 38–44.
48. Gajowniczek, K.; Orłowski, A.; Zabkowski, T. Simulation study on the application of the generalized entropy concept in artificial neural networks. Entropy 2018, 20, 249.
49. Lesche, B. Instabilities of Rényi entropies. J. Stat. Phys. 1982, 27, 419–422.
50. Mariz, A.M. On the irreversible nature of the Tsallis and Rényi entropies. Phys. Lett. A 1992, 165, 409–411.
51. Aczél, J.; Daróczy, Z. Charakterisierung der Entropien positiver Ordnung und der Shannonschen Entropie. Acta Math. Acad. Sci. Hung. 1963, 14, 95–121. (In German)
52. Jizba, P.; Arimitsu, T. The world according to Rényi: Thermodynamics of multifractal systems. Ann. Phys. 2004, 312, 17–59.
53. Rényi, A. On the foundations of information theory. Rev. Inst. Int. Stat. 1965, 33, 1–4.
54. Campbell, L.L. A coding theorem and Rényi’s entropy. Inf. Control 1965, 8, 423–429.
55. Csiszár, I. Generalized cutoff rates and Rényi information measures. IEEE Trans. Inf. Theory 1995, 41, 26–34.
56. Bennett, C.; Brassard, G.; Crépeau, C.; Maurer, U. Generalized privacy amplification. IEEE Trans. Inf. Theory 1995, 41, 1915–1923.
57. Kannathal, N.; Choo, M.L.; Acharya, U.R.; Sadasivan, P.K. Entropies for detection of epilepsy in EEG. Comput. Meth. Prog. Biomed. 2005, 80, 187–194.
58. Contreras-Reyes, J.E.; Cortés, D.D. Bounds on Rényi and Shannon entropies for finite mixtures of multivariate skew-normal distributions: Application to swordfish (Xiphias gladius Linnaeus). Entropy 2016, 11, 382.
59. Abramowitz, M.; Stegun, I.A. Handbook of Mathematical Functions; Dover Publications: New York, NY, USA, 1972.
60. Hanel, R.; Thurner, S. Generalized (c, d)-entropy and aging random walks. Entropy 2013, 15, 5324–5337.
61. Chavanis, P.H. Nonlinear mean field Fokker–Planck equations. Application to the chemotaxis of biological populations. Eur. Phys. J. B 2008, 62, 179–208.
62. Martinez, S.; Plastino, A.R.; Plastino, A. Nonlinear Fokker–Planck equations and generalized entropies. Physica A 1998, 259, 183–192.
63. Bouchaud, J.P.; Georges, A. Anomalous diffusion in disordered media: Statistical mechanisms, models and physical applications. Phys. Rep. 1990, 195, 127–293.
64. Dubkov, A.A.; Spagnolo, B.; Uchaikin, V.V. Lévy flight superdiffusion: An introduction. Int. J. Bifurcat. Chaos 2008, 18, 2649–2672.
65. Schwämmle, V.; Curado, E.M.F.; Nobre, F.D. A general nonlinear Fokker–Planck equation and its associated entropy. EPJ B 2007, 58, 159–165.
66. Czégel, D.; Balogh, S.G.; Pollner, P.; Palla, G. Phase space volume scaling of generalized entropies and anomalous diffusion scaling governed by corresponding nonlinear Fokker–Planck equations. Sci. Rep. 2018, 8, 1883.
67. Anteneodo, C.; Plastino, A.R. Maximum entropy approach to stretched exponential probability distributions. J. Phys. A Math. Gen. 1999, 32, 1089–1098.
68. Kaniadakis, G. Statistical mechanics in the context of special relativity. Phys. Rev. E 2002, 66, 056125.
69. Curado, E.M.; Nobre, F.D. On the stability of analytic entropic forms. Physica A 2004, 335, 94–106.
70. Tsekouras, G.A.; Tsallis, C. Generalized entropy arising from a distribution of q indices. Phys. Rev. E 2005, 71, 046144.
71. Shafee, F. Lambert function and a new non-extensive form of entropy. IMA J. Appl. Math. 2007, 72, 785–800.

**Figure 3.** Entropies ${H}_{i}(p,1-p)$, $i=1,2,3$, along with ${S}_{BGS}(p,1-p)$ and ${H}_{2}-{S}_{BGS}-1$ for comparison.

© 2018 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).

## Share and Cite

**MDPI and ACS Style**

Amigó, J.M.; Balogh, S.G.; Hernández, S.
A Brief Review of Generalized Entropies. *Entropy* **2018**, *20*, 813.
https://doi.org/10.3390/e20110813
