# On Conditional Tsallis Entropy


## Abstract


## 1. Introduction

## 2. Preliminaries

## 3. Conditional Tsallis Entropy: Four Definitions

**Definition 1.**

1. Definition of $T_{\alpha}(Y|X)$ from Reference [15]:
   $$T_{\alpha}(Y|X) = \sum_{x}\mathrm{P}(X=x)^{\alpha}\, T_{\alpha}(Y|X=x) = \frac{1}{\alpha-1}\sum_{x}\mathrm{P}(X=x)^{\alpha}\left(1-\sum_{y}\mathrm{P}(Y=y|X=x)^{\alpha}\right).$$
   One can easily verify that $T_{\alpha}(X,Y) = T_{\alpha}(Y|X) + T_{\alpha}(X)$ and, therefore, this definition satisfies the chain rule.
2. Definition of $S_{\alpha}(Y|X)$ from [16] (Definition 2.8):
   $$S_{\alpha}(Y|X) = \sum_{x}\mathrm{P}(X=x)\, T_{\alpha}(Y|X=x) = \frac{1}{\alpha-1}\sum_{x}\mathrm{P}(X=x)\left(1-\sum_{y}\mathrm{P}(Y=y|X=x)^{\alpha}\right).$$
3. Definition of $S_{\alpha}^{\prime}(Y|X)$ from [16] (Definition 2.10):
   $$S_{\alpha}^{\prime}(Y|X) = \frac{1}{\alpha-1}\left(1-\frac{\sum_{x,y}\mathrm{P}(X=x,Y=y)^{\alpha}}{\sum_{x}\mathrm{P}(X=x)^{\alpha}}\right).$$
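As a concrete illustration, the three definitions above can be evaluated on a small joint distribution. The sketch below is ours, not from the paper: the joint distribution and function names are illustrative, and $S_{\alpha}^{\prime}$ is implemented in the form $\frac{1}{\alpha-1}\left(1-\sum_{x,y}\mathrm{P}(x,y)^{\alpha}/\sum_{x}\mathrm{P}(x)^{\alpha}\right)$, the form that recovers $H(Y|X)$ as $\alpha \to 1$. The final assertion checks the chain rule claimed for $T_{\alpha}(Y|X)$.

```python
def tsallis(p, alpha):
    # unconditional Tsallis entropy of a probability vector p (alpha != 1)
    return (1.0 - sum(q ** alpha for q in p if q > 0)) / (alpha - 1.0)

def T_cond(joint, alpha):
    # T_alpha(Y|X) = sum_x P(X=x)^alpha * T_alpha(Y|X=x)
    out = 0.0
    for row in joint:
        px = sum(row)
        out += px ** alpha * tsallis([q / px for q in row], alpha)
    return out

def S_cond(joint, alpha):
    # S_alpha(Y|X) = sum_x P(X=x) * T_alpha(Y|X=x)
    out = 0.0
    for row in joint:
        px = sum(row)
        out += px * tsallis([q / px for q in row], alpha)
    return out

def Sp_cond(joint, alpha):
    # S'_alpha(Y|X) = (1 - sum_{x,y} P(x,y)^alpha / sum_x P(x)^alpha) / (alpha - 1)
    num = sum(q ** alpha for row in joint for q in row)
    den = sum(sum(row) ** alpha for row in joint)
    return (1.0 - num / den) / (alpha - 1.0)

# hypothetical joint distribution P(X=x, Y=y); rows are values of X
joint = [[0.1, 0.2], [0.3, 0.4]]
alpha = 2.0

# chain rule: T_alpha(X,Y) = T_alpha(Y|X) + T_alpha(X)
marginal_x = [sum(row) for row in joint]
flat = [q for row in joint for q in row]
lhs = tsallis(flat, alpha)
rhs = T_cond(joint, alpha) + tsallis(marginal_x, alpha)
assert abs(lhs - rhs) < 1e-12
```

Running the same check with `S_cond` or `Sp_cond` in place of `T_cond` fails, which is why only $T_{\alpha}(Y|X)$ is credited with the chain rule in Table 1.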

**Definition 2.**

**Theorem 1.**

## 4. Comparison of the Definitions

**Theorem 2.**

- (i) $T_{\alpha}(Y|X)$, $S_{\alpha}(Y|X)$ and $S_{\alpha}^{\prime}(Y|X)$, as functions of $\alpha$, are continuous and differentiable;
- (ii) $T_{\alpha}^{\prime}(Y|X)$, as a function of $\alpha$, is continuous for all $\alpha \neq 1$.

**Theorem 3.**

**Proof.**

**Theorem 4.**

**Proof.**

**Corollary 1.**

**Theorem 5.**

**Proof.**

**Theorem 6.**

**Proof.**

**Theorem 7.**

**Proof.**

## 5. Properties of the Conditional Tsallis Entropies

**Theorem 8.**

**Proof.**

**Theorem 9.**

**Proof.**

#### Bounds on Conditional Tsallis Entropy

**Theorem 10.**

**Proof.**

**Theorem 11.**

**Proof.**

**Theorem 12.**

**Lemma 1.**

**Theorem 13.**

1. $T_{\alpha}(Y|X)$ is a non-increasing function of $\alpha$.
2. $S_{\alpha}(Y|X)$ is a non-increasing function of $\alpha$.
3. $T_{\alpha}^{\prime}(Y|X)$ is a non-increasing function of $\alpha$.

**Proof.**

1. First consider the case $\alpha > 1$, and consider $\frac{dT_{\alpha}(Y|X)}{d\alpha}$, the derivative of $T_{\alpha}(Y|X)$ with respect to $\alpha$:
   $$\frac{dT_{\alpha}(Y|X)}{d\alpha} = -\frac{\displaystyle\sum_{x}\mathrm{P}(X=x)^{\alpha} - \sum_{x,y}\mathrm{P}(X=x,Y=y)^{\alpha}}{(\alpha-1)^{2}} + \frac{\displaystyle\sum_{x}\mathrm{P}(X=x)^{\alpha}\log \mathrm{P}(X=x) - \sum_{x,y}\mathrm{P}(X=x,Y=y)^{\alpha}\log \mathrm{P}(X=x,Y=y)}{\alpha-1}.$$
2. This part of the result follows from the fact that $S_{\alpha}(Y|X)$ is the expectation of unconditional Tsallis entropies (see Equation (6)), each of which is non-increasing in $\alpha$.
3. Suppose that $\alpha > 1$. The result is a direct consequence of Equation (11) and Lemma 1. The case $\alpha < 1$ can be proven in a similar way.
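The monotonicity claims can be spot-checked numerically. The following sketch is ours (the joint distribution is a hypothetical example): it compares the analytic derivative of $T_{\alpha}(Y|X)$, written as $T_{\alpha}(Y|X)=\frac{\sum_x \mathrm{P}(x)^{\alpha}-\sum_{x,y}\mathrm{P}(x,y)^{\alpha}}{\alpha-1}$ and differentiated in $\alpha$, against a finite difference, and verifies that $T_{\alpha}(Y|X)$ and $S_{\alpha}(Y|X)$ decrease along a grid of $\alpha$ values.

```python
import math

def tsallis(p, alpha):
    # unconditional Tsallis entropy of a probability vector p (alpha != 1)
    return (1.0 - sum(q ** alpha for q in p if q > 0)) / (alpha - 1.0)

def T_cond(joint, alpha):
    # T_alpha(Y|X) = sum_x P(X=x)^alpha * T_alpha(Y|X=x)
    out = 0.0
    for row in joint:
        px = sum(row)
        out += px ** alpha * tsallis([q / px for q in row], alpha)
    return out

def S_cond(joint, alpha):
    # S_alpha(Y|X) = sum_x P(X=x) * T_alpha(Y|X=x)
    return sum(sum(row) * tsallis([q / sum(row) for q in row], alpha)
               for row in joint)

def dT_cond(joint, alpha):
    # analytic derivative of T_alpha(Y|X) with respect to alpha
    # (assumes strictly positive probabilities)
    a = sum(sum(row) ** alpha for row in joint)            # sum_x P(x)^alpha
    b = sum(q ** alpha for row in joint for q in row)      # sum_{x,y} P(x,y)^alpha
    da = sum(sum(row) ** alpha * math.log(sum(row)) for row in joint)
    db = sum(q ** alpha * math.log(q) for row in joint for q in row)
    return -(a - b) / (alpha - 1.0) ** 2 + (da - db) / (alpha - 1.0)

joint = [[0.05, 0.25], [0.4, 0.3]]   # hypothetical joint distribution P(X=x, Y=y)

# derivative formula agrees with a central finite difference at alpha = 2
h = 1e-6
fd = (T_cond(joint, 2.0 + h) - T_cond(joint, 2.0 - h)) / (2 * h)
assert abs(dT_cond(joint, 2.0) - fd) < 1e-6

# T_alpha(Y|X) and S_alpha(Y|X) are non-increasing in alpha
alphas = [0.25, 0.5, 0.75, 1.25, 1.5, 2.0, 3.0]
for f in (T_cond, S_cond):
    vals = [f(joint, a) for a in alphas]
    assert all(u >= v - 1e-12 for u, v in zip(vals, vals[1:]))
```

A numerical check is of course no substitute for the proof, but it catches sign errors in the derivative quickly.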

**Theorem 14.**

**Proof.**

## 6. Conclusions

- Chain rule;
- Convergence to the Shannon conditional entropy as the parameter $\alpha$ tends to 1;
- Taking values between 0 and the upper bound of the unconditional version.


## References

1. Tsallis, C. Possible generalization of Boltzmann-Gibbs statistics. J. Stat. Phys. 1988, 52, 479–487.
2. Daróczy, Z. Generalized information functions. Inf. Control 1970, 16, 36–51.
3. Havrda, J.; Charvát, F. Quantification method of classification processes. Concept of structural α-entropy. Kybernetika 1967, 3, 30–35.
4. Wehrl, A. General properties of entropy. Rev. Mod. Phys. 1978, 50, 221–260.
5. Cover, T.; Thomas, J.A. Elements of Information Theory, 2nd ed.; Wiley: Hoboken, NJ, USA, 2006.
6. Shannon, C.E. A Mathematical Theory of Communication. Bell Syst. Tech. J. 1948, 27, 379–423, 623–656.
7. Tsallis, C. The Nonadditive Entropy Sq and Its Applications in Physics and Elsewhere: Some Remarks. Entropy 2011, 13, 1765–1804.
8. Osorio, R.; Borland, L.; Tsallis, C. Distributions of high-frequency stock-market observables. In Nonextensive Entropy—Interdisciplinary Applications; Gell-Mann, M., Tsallis, C., Eds.; Oxford University Press: New York, NY, USA, 2004.
9. Ibrahim, R.W.; Darus, M. Analytic Study of Complex Fractional Tsallis' Entropy with Applications in CNNs. Entropy 2018, 20, 722.
10. Mohanalin, B.; Kalra, P.K.; Kumar, N. A novel automatic microcalcification detection technique using Tsallis entropy and a type II fuzzy index. Comput. Math. Appl. 2010, 60, 2426–2432.
11. Tamarit, F.A.; Cannas, S.A.; Tsallis, C. Sensitivity to initial conditions in the Bak-Sneppen model of biological evolution. Eur. Phys. J. B 1998, 1, 545–548.
12. Group of Statistical Physics. Available online: http://tsallis.cat.cbpf.br/biblio.htm (accessed on 8 November 2018).
13. Ribeiro, M.; Henriques, T.; Castro, L.; Souto, A.; Antunes, L.; Costa-Santos, C.; Teixeira, A. The Entropy Universe. Entropy 2021, 23, 222.
14. Rényi, A. On measures of information and entropy. Berkeley Symp. Math. Statist. Prob. 1961, 1, 547–561.
15. Furuichi, S. Information theoretical properties of Tsallis entropies. J. Math. Phys. 2006, 47, 023302.
16. Manije, S.; Gholamreza, M.; Mohammad, A. Conditional Tsallis Entropy. Cyb. Inf. Technol. 2013, 13, 37–42.
17. Heinrich, F.; Ramzan, F.; Rajavel, F.A.; Schmitt, A.O.; Gültas, M. MIDESP: Mutual Information-Based Detection of Epistatic SNP Pairs for Qualitative and Quantitative Phenotypes. Biology 2021, 10, 921.
18. Oggier, F.; Datta, A. A Rényi entropy driven hierarchical graph clustering. PeerJ Comput. Sci. 2021, 7, e366.
19. Tao, M.; Wang, S.; Chen, H.; Wang, X. Information space of multi-sensor networks. Inf. Sci. 2021, 565, 128–245.
20. Jozsa, R.; Schlienz, J. Distinguishability of states and von Neumann entropy. Phys. Rev. A 2000, 62, 012301.
21. Hassani, H.; Unger, S.; Entezarian, M. Information content measurement of ESG factors via entropy and its impact on society and security. Information 2021, 12, 391.
22. Shannon, C.E. Communication theory of secrecy systems. Bell Syst. Tech. J. 1949, 28, 656–715.
23. Bhotto, M.Z.A.; Antoniou, A. A new normalized minimum-error entropy algorithm with reduced computational complexity. In Proceedings of the 2009 IEEE International Symposium on Circuits and Systems, Taipei, Taiwan, 24–27 May 2009; pp. 2561–2564.
24. Teixeira, A.; Matos, A.; Souto, A.; Antunes, L. Entropy measures vs. Kolmogorov complexity. Entropy 2011, 13, 595–611.
25. Teixeira, A.; Souto, A.; Matos, A.; Antunes, L. Entropy measures vs. algorithmic information. In Proceedings of the 2010 IEEE International Symposium on Information Theory, Austin, TX, USA, 13–18 June 2010; pp. 1413–1417.
26. Edgar, T.; Manz, D. Chapter 2: Science and Cyber Security. In Research Methods for Cyber Security; Syngress: Amsterdam, The Netherlands, 2017; pp. 33–62.
27. Huang, L.; Shen, Y.; Zhang, G.; Luo, H. Information system security risk assessment based on multidimensional cloud model and the entropy theory. In Proceedings of the 2015 IEEE 5th International Conference on Electronics Information and Emergency Communication, Beijing, China, 14–16 May 2015; pp. 11–15.
28. Lu, R.; Shen, H.; Feng, Z.; Li, H.; Zhao, W.; Li, X. HTDet: A clustering method using information entropy for hardware Trojan detection. Tsinghua Sci. Technol. 2021, 26, 48–61.
29. Firman, T.; Balázsi, G.; Ghosh, K. Building Predictive Models of Genetic Circuits Using the Principle of Maximum Caliber. Biophys. J. 2017, 113, 2121–2130.
30. Jost, L. Entropy and diversity. Oikos 2006, 113, 363–375.
31. Roach, T.N.F. Use and Abuse of Entropy in Biology: A Case for Caliber. Entropy 2020, 22, 1335.
32. Simpson, E. Measurement of diversity. Nature 1949, 163, 688.
33. Yin, Y.; Shang, P. Weighted permutation entropy based on different symbolic approaches for financial time series. Phys. A Stat. Mech. Its Appl. 2016, 443, 137–148.
34. Castiglioni, P.; Parati, G.; Faini, A. Information-Domain Analysis of Cardiovascular Complexity: Night and Day Modulations of Entropy and the Effects of Hypertension. Entropy 2019, 21, 550.
35. Polizzotto, N.R.; Takahashi, T.; Walker, C.P.; Cho, R.Y. Wide Range Multiscale Entropy Changes through Development. Entropy 2016, 18, 12.
36. Prabhu, K.P.; Martis, R.J. Diagnosis of Schizophrenia using Kolmogorov Complexity and Sample Entropy. In Proceedings of the 2020 IEEE International Conference on Electronics, Computing and Communication Technologies (CONECCT), Bangalore, India, 2–4 July 2020; pp. 1–4.
37. Fehr, S.; Berens, S. On the Conditional Rényi Entropy. IEEE Trans. Inf. Theory 2014, 60, 6801–6810.
38. Teixeira, A.; Matos, A.; Antunes, L. Conditional Rényi Entropies. IEEE Trans. Inf. Theory 2012, 58, 4273–4277.

**Figure 1.**Summary of the relations between the several proposals for the definition of conditional Tsallis entropy.

**Table 1.**Summary of the proved properties of all proposed conditional entropies. The question mark indicates that the property is not known to be fulfilled.

| $f(Y\mid X)$ | $T_{\alpha}(Y\mid X)$ | $S_{\alpha}(Y\mid X)$ | $S_{\alpha}^{\prime}(Y\mid X)$ | $T_{\alpha}^{\prime}(Y\mid X)$ |
|---|---|---|---|---|
| Chain rule | yes | no | no | no |
| $\lim_{\alpha \to 1} f(Y\mid X) = H(Y\mid X)$ | yes | yes | yes | no |
| $0 \le f(Y\mid X) \le \frac{\lvert Y\rvert^{1-\alpha}-1}{1-\alpha}$ and $\alpha > 1$ | yes | yes | yes | yes |
| $0 \le f(Y\mid X) \le \frac{\lvert Y\rvert^{1-\alpha}-1}{1-\alpha}$ and $\alpha < 1$ | no | yes | ? | yes |
| $f$ is non-increasing with $\alpha$ | yes | yes | no | yes |
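Two rows of Table 1 can be spot-checked numerically. The sketch below is ours (the joint distribution is a hypothetical example): it verifies that $T_{\alpha}$, $S_{\alpha}$ and $S_{\alpha}^{\prime}$ approach the Shannon conditional entropy as $\alpha \to 1$, and that for $\alpha > 1$ all three stay within $[0, \ln_{\alpha}\lvert Y\rvert]$, where $\ln_{\alpha}\lvert Y\rvert = \frac{\lvert Y\rvert^{1-\alpha}-1}{1-\alpha}$ is the maximum of the unconditional Tsallis entropy.

```python
import math

def tsallis(p, alpha):
    # unconditional Tsallis entropy; Shannon entropy (nats) at alpha = 1
    if abs(alpha - 1.0) < 1e-12:
        return -sum(q * math.log(q) for q in p if q > 0)
    return (1.0 - sum(q ** alpha for q in p if q > 0)) / (alpha - 1.0)

def T_cond(joint, alpha):
    return sum(sum(row) ** alpha * tsallis([q / sum(row) for q in row], alpha)
               for row in joint)

def S_cond(joint, alpha):
    return sum(sum(row) * tsallis([q / sum(row) for q in row], alpha)
               for row in joint)

def Sp_cond(joint, alpha):
    num = sum(q ** alpha for row in joint for q in row)
    den = sum(sum(row) ** alpha for row in joint)
    return (1.0 - num / den) / (alpha - 1.0)

joint = [[0.1, 0.2], [0.3, 0.4]]   # hypothetical joint distribution P(X=x, Y=y)
H = S_cond(joint, 1.0)             # Shannon conditional entropy H(Y|X) in nats

# row 2 of Table 1: lim_{alpha -> 1} f(Y|X) = H(Y|X) for T, S and S'
for f in (T_cond, S_cond, Sp_cond):
    assert abs(f(joint, 1.0 + 1e-6) - H) < 1e-3
    assert abs(f(joint, 1.0 - 1e-6) - H) < 1e-3

# row 3 of Table 1: for alpha > 1, 0 <= f(Y|X) <= (|Y|^{1-alpha} - 1)/(1 - alpha)
alpha, card_y = 2.0, 2
bound = (card_y ** (1 - alpha) - 1) / (1 - alpha)
for f in (T_cond, S_cond, Sp_cond):
    assert 0.0 <= f(joint, alpha) <= bound + 1e-12
```

One example distribution cannot prove the "yes" entries, but it does make the "no" entry for $T_{\alpha}$ with $\alpha < 1$ plausible: at $\alpha = 0.5$ the same script gives $T_{\alpha}(Y\mid X)$ above the bound.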


© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).

## Share and Cite

**MDPI and ACS Style**

Teixeira, A.; Souto, A.; Antunes, L.
On Conditional Tsallis Entropy. *Entropy* **2021**, *23*, 1427.
https://doi.org/10.3390/e23111427
