Entropy 2011, 13(9), 1694-1707; doi:10.3390/e13091694

Tsallis Mutual Information for Document Classification

Màrius Vila, Anton Bardera, Miquel Feixas and Mateu Sbert
Received: 1 August 2011; in revised form: 5 September 2011 / Accepted: 8 September 2011 / Published: 14 September 2011
(This article belongs to the Special Issue Tsallis Entropy)
This is an open access article distributed under the Creative Commons Attribution License which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
Abstract: Mutual information is one of the most widely used measures for evaluating image similarity. In this paper, we investigate the application of three different Tsallis-based generalizations of mutual information to analyze the similarity between scanned documents. These three generalizations derive from the Kullback–Leibler distance, the difference between entropy and conditional entropy, and the Jensen–Tsallis divergence, respectively. In addition, the ratio between these measures and the Tsallis joint entropy is analyzed. The performance of all these measures is studied for different entropic indexes in the context of document classification and registration.
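To make the quantities in the abstract concrete, the following is a minimal sketch of the Tsallis entropy and one of the mutual-information generalizations mentioned (the entropy-minus-conditional-entropy form, written here as S_q(X) + S_q(Y) − S_q(X,Y)). The function names, the joint-histogram input, and the example distribution are illustrative assumptions, not the paper's implementation; note that for q ≠ 1 this difference need not vanish for independent variables, owing to the nonadditivity of Tsallis entropy.

```python
import numpy as np

def tsallis_entropy(p, q):
    """Tsallis entropy S_q(p) = (1 - sum_i p_i^q) / (q - 1).

    Recovers the Shannon entropy (in nats) in the limit q -> 1."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # 0 * log 0 and 0^q contribute nothing
    if abs(q - 1.0) < 1e-12:
        return float(-np.sum(p * np.log(p)))  # Shannon limit
    return float((1.0 - np.sum(p ** q)) / (q - 1.0))

def tsallis_mutual_information(pxy, q):
    """One Tsallis generalization of MI: S_q(X) + S_q(Y) - S_q(X,Y),
    computed from a joint probability table pxy (2-D array).
    For image similarity, pxy would be a normalized joint intensity
    histogram of the two images (an assumption of this sketch)."""
    pxy = np.asarray(pxy, dtype=float)
    px = pxy.sum(axis=1)  # marginal of X
    py = pxy.sum(axis=0)  # marginal of Y
    return (tsallis_entropy(px, q) + tsallis_entropy(py, q)
            - tsallis_entropy(pxy.ravel(), q))

# Example: a perfectly correlated joint histogram (identical images).
joint = np.array([[0.5, 0.0],
                  [0.0, 0.5]])
print(tsallis_mutual_information(joint, q=1.5))
```

For q = 1 the measure reduces to the ordinary Shannon mutual information, which is zero for an independent joint distribution; the paper studies how varying the entropic index q away from 1 affects classification and registration performance.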
Keywords: Tsallis entropy; mutual information; image similarity; document classification


MDPI and ACS Style

Vila, M.; Bardera, A.; Feixas, M.; Sbert, M. Tsallis Mutual Information for Document Classification. Entropy 2011, 13, 1694-1707.


Entropy EISSN 1099-4300, Published by MDPI AG, Basel, Switzerland