# Tsallis Entropy and Generalized Shannon Additivity


## Abstract


## 1. Introduction

**Some history.** In 1988, Tsallis [1] generalized the Boltzmann-Gibbs entropy to a one-parameter family of entropies, now called Tsallis entropies.

**Tsallis entropy.** In the following, let $\Delta_n=\{(p_1,p_2,\dots,p_n)\in(\mathbb{R}^+)^n:\sum_{i=1}^n p_i=1\}$ for $n\in\mathbb{N}$ be the set of all $n$-dimensional stochastic vectors and $\Delta=\bigcup_{n\in\mathbb{N}}\Delta_n$ be the set of all stochastic vectors, where $\mathbb{N}=\{1,2,3,\dots\}$ and $\mathbb{R}^+$ denote the sets of natural numbers and of nonnegative real numbers, respectively. Given $\alpha>0$ with $\alpha\ne 1$, the Tsallis entropy of a stochastic vector $(p_1,p_2,\dots,p_n)$ of some dimension $n$ is defined by
$$S_\alpha(p_1,p_2,\dots,p_n)=\frac{1-\sum_{i=1}^n p_i^\alpha}{\alpha-1}.$$
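The definition above can be sketched numerically. The following is a minimal illustration (the function name `tsallis_entropy` is ours, not from the paper); it also includes the limiting case $\alpha\to 1$, which recovers the Boltzmann-Gibbs/Shannon entropy:

```python
import numpy as np

def tsallis_entropy(p, alpha):
    """Tsallis entropy S_alpha(p) = (1 - sum_i p_i^alpha) / (alpha - 1)
    for a stochastic vector p in Delta_n and alpha > 0."""
    p = np.asarray(p, dtype=float)
    assert abs(p.sum() - 1.0) < 1e-12 and (p >= 0).all(), "p must lie in Delta_n"
    q = p[p > 0]  # zero entries contribute 0 to the sum
    if alpha == 1.0:
        # limiting case alpha -> 1: Boltzmann-Gibbs / Shannon entropy
        return -(q * np.log(q)).sum()
    return (1.0 - (q ** alpha).sum()) / (alpha - 1.0)

print(tsallis_entropy([0.5, 0.5], 2.0))  # 0.5
print(tsallis_entropy([1.0, 0.0], 2.0))  # 0.0: deterministic vectors have zero entropy
```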

**Axiomatic characterizations.** One line of characterization, mainly followed by Suyari [7] and discussed in this paper, has its origin in the Shannon-Khinchin axioms of Shannon entropy (see [8,9]). Note that other characterizations of Tsallis entropy are due to dos Santos [10], Abe [11] and Furuichi [12]. For a general discussion of the axiomatization of entropies, see [13].

**The main result.** In this paper, we study the role of generalized Shannon additivity in characterizing Tsallis entropy. For $\alpha\in\mathbb{R}^+\setminus\{0\}$, a map $H:\Delta\to\mathbb{R}$ satisfies generalized Shannon additivity if
$$H(p_{11},\dots,p_{1m_1},\dots,p_{n1},\dots,p_{nm_n})=H(p_1,\dots,p_n)+\sum_{i=1}^n p_i^\alpha\,H\!\left(\frac{p_{i1}}{p_i},\dots,\frac{p_{im_i}}{p_i}\right)$$
whenever $p_i=\sum_{j=1}^{m_i}p_{ij}>0$ for all $i$; we also consider a slightly relaxed version of this property.
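Generalized Shannon additivity holds exactly for Tsallis entropy, which can be checked numerically. The sketch below (function names `tsallis` and `gsa_residual` are ours) verifies the identity for a joint distribution arranged as a matrix, with rows indexed by $i$ and columns by $j$:

```python
import numpy as np

def tsallis(p, alpha):
    """Tsallis entropy (1 - sum_i p_i^alpha) / (alpha - 1), alpha != 1."""
    p = np.asarray(p, dtype=float)
    return (1.0 - (p[p > 0] ** alpha).sum()) / (alpha - 1.0)

def gsa_residual(pij, alpha):
    """Difference between the two sides of generalized Shannon additivity:
    H(p_11, ..., p_nm) - H(p_1, ..., p_n) - sum_i p_i^alpha H(p_i1/p_i, ...)."""
    pij = np.asarray(pij, dtype=float)
    p = pij.sum(axis=1)  # marginal distribution (p_1, ..., p_n)
    lhs = tsallis(pij.ravel(), alpha)
    rhs = tsallis(p, alpha) + sum(
        p[i] ** alpha * tsallis(pij[i] / p[i], alpha) for i in range(len(p))
    )
    return lhs - rhs

pij = np.array([[0.1, 0.3], [0.2, 0.4]])
print(abs(gsa_residual(pij, alpha=1.7)) < 1e-12)  # True: the identity holds exactly
```

The residual vanishes for every $\alpha\ne 1$ because the $p_i^\alpha$ weights make the conditional terms telescope: $\sum_i p_i^\alpha\sum_j (p_{ij}/p_i)^\alpha=\sum_{i,j}p_{ij}^\alpha$.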

**Theorem 1.** *Let $\alpha\in\mathbb{R}^+\setminus\{0\}$, and let $H:\Delta\to\mathbb{R}$ satisfy generalized Shannon additivity.*

- (i)
- If $\alpha\ne 1,2$, then$$H(p_1,p_2,\dots,p_n)=H\!\left(\frac{1}{2},\frac{1}{2}\right)\frac{1-\sum_{i=1}^n p_i^\alpha}{1-2^{1-\alpha}}\quad\text{for all }(p_1,p_2,\dots,p_n)\in\Delta.$$
- (ii)
- If $\alpha =2$, then the following statements are equivalent:
- (a)
- It holds$$H(p_1,p_2,\dots,p_n)=2H\!\left(\frac{1}{2},\frac{1}{2}\right)\left(1-\sum_{i=1}^n p_i^2\right)\quad\text{for all }(p_1,p_2,\dots,p_n)\in\Delta,$$
- (b)
- H is bounded on $\Delta_2$,
- (c)
- H is continuous on $\Delta_2$,
- (d)
- H is symmetric on $\Delta_2$,
- (e)
- H does not change sign on $\Delta_2$.

- (iii)
- If $\alpha =1$, then the following statements are equivalent:
- (a)
- It holds$$H(p_1,p_2,\dots,p_n)=-\frac{H\!\left(\frac{1}{2},\frac{1}{2}\right)}{\ln 2}\sum_{i=1}^n p_i\ln p_i\quad\text{for all }(p_1,p_2,\dots,p_n)\in\Delta,$$
- (b)
- H is bounded on $\Delta_2$.
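The three cases of the theorem fit together: as $\alpha\to 1$, the formula of part (i) converges to the Shannon formula of part (iii)(a). The sketch below checks this numerically, with $H(\frac{1}{2},\frac{1}{2})$ normalized to $1$ (function names are ours, for illustration only):

```python
import numpy as np

def theorem_formula(p, alpha):
    """Right-hand side of Theorem 1 (i) with H(1/2, 1/2) = 1."""
    p = np.asarray(p, dtype=float)
    return (1.0 - (p[p > 0] ** alpha).sum()) / (1.0 - 2.0 ** (1.0 - alpha))

def shannon_formula(p):
    """Right-hand side of Theorem 1 (iii)(a) with H(1/2, 1/2) = 1,
    i.e., the base-2 Shannon entropy."""
    p = np.asarray(p, dtype=float)
    q = p[p > 0]
    return -(q * np.log(q)).sum() / np.log(2.0)

p = np.array([0.2, 0.3, 0.5])
# the gap shrinks roughly linearly in (alpha - 1) as alpha -> 1
for alpha in [1.1, 1.01, 1.001]:
    print(alpha, theorem_formula(p, alpha) - shannon_formula(p))
```

Note also that for every $\alpha$ the normalization is consistent: `theorem_formula([0.5, 0.5], alpha)` equals $1$, since $\frac{1-2\cdot(1/2)^\alpha}{1-2^{1-\alpha}}=1$.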

## 2. Proof of the Main Result

**Lemma 1.**

**Proof.**

**Lemma 2.**

**Proof.**

**Lemma 3.**

- (i)
- If $\alpha\ne 2$, then$$H(p_1,p_2)=H\!\left(\frac{1}{2},\frac{1}{2}\right)\frac{1-p_1^\alpha-p_2^\alpha}{1-2^{1-\alpha}}\quad\text{for all }(p_1,p_2)\in\Delta_2.$$
- (ii)
- If $\alpha =2$, then the following statements are equivalent:
- (a)
- It holds$$H(p_1,p_2)=2H\!\left(\frac{1}{2},\frac{1}{2}\right)\left(1-p_1^2-p_2^2\right)\quad\text{for all }(p_1,p_2)\in\Delta_2,$$
- (b)
- H is symmetric on $\Delta_2$, meaning that $H(p_1,p_2)=H(p_2,p_1)$ for all $(p_1,p_2)\in\Delta_2$,
- (c)
- H is continuous on $\Delta_2$,
- (d)
- H is bounded on $\Delta_2$,
- (e)
- H is nonnegative or nonpositive on $\Delta_2$.

**Proof.**

## 3. Further Discussion

**Proposition 1.**

**Proof.**

**Problem 1.**

## Author Contributions

## Conflicts of Interest

## References

1. Tsallis, C. Possible generalization of Boltzmann-Gibbs statistics. *J. Stat. Phys.* **1988**, *52*, 479–487.
2. Cartwright, J. Roll over, Boltzmann. *Phys. World* **2014**, *27*, 31–35.
3. Tsallis, C. Approach of complexity in nature: Entropic nonuniqueness. *Axioms* **2016**, *5*, 20.
4. Havrda, J.; Charvát, F. Quantification method of classification processes. Concept of structural α-entropy. *Kybernetika* **1967**, *3*, 30–35.
5. Amigó, J.M.; Keller, K.; Unakafova, V.A. On entropy, entropy-like quantities, and applications. *Discrete Contin. Dyn. Syst. B* **2015**, *20*, 3301–3343.
6. Guariglia, E. Entropy and Fractal Antennas. *Entropy* **2016**, *18*, 1–17.
7. Suyari, H. Generalization of Shannon-Khinchin axioms to nonextensive systems and the uniqueness theorem for the nonextensive entropy. *IEEE Trans. Inf. Theory* **2004**, *50*, 1783–1787.
8. Khinchin, A.I. *Mathematical Foundations of Information Theory*; Dover: New York, NY, USA, 1957.
9. Shannon, C.E. A mathematical theory of communication. *Bell Syst. Tech. J.* **1948**, *27*, 379–423 and 623–656.
10. Dos Santos, R.J.V. Generalization of Shannon's theorem for Tsallis entropy. *J. Math. Phys.* **1997**, *38*, 4104–4107.
11. Abe, S. Tsallis entropy: How unique? *Contin. Mech. Thermodyn.* **2004**, *16*, 237–244.
12. Furuichi, S. On uniqueness theorems for Tsallis entropy and Tsallis relative entropy. *IEEE Trans. Inf. Theory* **2005**, *51*, 3638–3645.
13. Csiszár, I. Axiomatic characterizations of information measures. *Entropy* **2008**, *10*, 261–273.
14. Ilić, V.M.; Stanković, M.S.; Mulalić, E.H. Comments on "Generalization of Shannon-Khinchin axioms to nonextensive systems and the uniqueness theorem for the nonextensive entropy". *IEEE Trans. Inf. Theory* **2013**, *59*, 6950–6952.
15. Diderrich, G.T. The role of boundedness in characterizing Shannon entropy. *Inf. Control* **1975**, *29*, 140–161.
16. Faddeev, D.F. On the concept of entropy of a finite probability scheme. *Uspehi Mat. Nauk* **1956**, *11*, 227–231. (In Russian)
17. Daróczy, Z.; Maksa, D. Nonnegative information functions. Analytic function methods in probability theory. *Colloq. Math. Soc. Janos Bolyai* **1982**, *21*, 67–78.
18. Nambiar, K.K.; Varma, P.K.; Saroch, V. An axiomatic definition of Shannon's entropy. *Appl. Math. Lett.* **1992**, *5*, 45–46.

© 2017 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).

## Share and Cite

**MDPI and ACS Style**

Jäckle, S.; Keller, K. Tsallis Entropy and Generalized Shannon Additivity. *Axioms* **2017**, *6*, 14.
https://doi.org/10.3390/axioms6020014
