Entropy 2011, 13(9), 1694-1707; doi:10.3390/e13091694

Tsallis Mutual Information for Document Classification

Vila, M. *; Bardera, A.; Feixas, M.; Sbert, M.
Institut d’Informàtica i Aplicacions, Universitat de Girona, Campus Montilivi, Girona 17071, Spain
* Author to whom correspondence should be addressed.
Received: 1 August 2011 / Revised: 5 September 2011 / Accepted: 8 September 2011 / Published: 14 September 2011
(This article belongs to the Special Issue Tsallis Entropy)


Mutual information is one of the most widely used measures for evaluating image similarity. In this paper, we investigate the application of three different Tsallis-based generalizations of mutual information to analyze the similarity between scanned documents. These three generalizations derive, respectively, from the Kullback–Leibler distance, the difference between entropy and conditional entropy, and the Jensen–Tsallis divergence. In addition, the ratio between these measures and the Tsallis joint entropy is analyzed. The performance of all these measures is studied for different entropic indexes in the context of document classification and registration.
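The first generalization mentioned above, deriving from the Kullback–Leibler distance, can be illustrated with a short sketch: the Tsallis divergence between a joint distribution and the product of its marginals, which recovers standard mutual information as the entropic index q tends to 1. This is a minimal illustrative example, not the paper's implementation; the function names and the toy distributions are assumptions.

```python
import numpy as np

def tsallis_entropy(p, q):
    # S_q(p) = (1 - sum_i p_i^q) / (q - 1); reduces to Shannon entropy as q -> 1
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    if np.isclose(q, 1.0):
        return -np.sum(p * np.log(p))
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

def tsallis_mutual_information(joint, q):
    # KL-style generalization: Tsallis divergence between the joint
    # distribution p(x, y) and the product of its marginals p(x) p(y).
    joint = np.asarray(joint, dtype=float)
    px = joint.sum(axis=1)
    py = joint.sum(axis=0)
    prod = np.outer(px, py)
    mask = joint > 0
    if np.isclose(q, 1.0):
        # Standard (Shannon) mutual information in nats
        return np.sum(joint[mask] * np.log(joint[mask] / prod[mask]))
    return (np.sum(joint[mask] ** q * prod[mask] ** (1.0 - q)) - 1.0) / (q - 1.0)

# Independent variables: joint equals the product of marginals,
# so the measure is zero for any entropic index q.
joint_indep = np.outer([0.5, 0.5], [0.25, 0.75])
print(tsallis_mutual_information(joint_indep, q=1.5))  # ~0.0

# Correlated variables: the measure is positive.
joint_corr = np.array([[0.4, 0.1], [0.1, 0.4]])
print(tsallis_mutual_information(joint_corr, q=2.0))
```

In a registration or classification setting, `joint` would be the normalized joint intensity histogram of two document images, with the marginals obtained by summing over its rows and columns.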
Keywords: Tsallis entropy; mutual information; image similarity; document classification
This is an open access article distributed under the Creative Commons Attribution License (CC BY 3.0).

MDPI and ACS Style

Vila, M.; Bardera, A.; Feixas, M.; Sbert, M. Tsallis Mutual Information for Document Classification. Entropy 2011, 13, 1694-1707.


Entropy EISSN 1099-4300. Published by MDPI AG, Basel, Switzerland.