One of the crucial properties of the Boltzmann-Gibbs entropy in the context of classical thermodynamics is extensivity, namely proportionality to the number of elements of the system. The Boltzmann-Gibbs entropy satisfies this prescription if the subsystems are statistically (quasi-)independent, or, typically, if the correlations within the system are essentially local. In such cases the energy of the system is typically extensive and the entropy is additive. In general, however, the situation is not of this type, and correlations may be far from negligible at all scales. In 1988, Tsallis introduced an entropic expression characterized by an index q which leads to a non-extensive statistics. Tsallis entropy, Sq, is the basis of the so-called non-extensive statistical mechanics, which generalizes the Boltzmann-Gibbs theory. Tsallis statistics has found applications in a wide range of phenomena in diverse disciplines such as physics, chemistry, biology, medicine, economics, and geophysics. The aim of this special issue of Entropy was to solicit contributions applying Tsallis entropy in these various scientific fields.
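The non-additivity mentioned above can be made concrete with a short numerical sketch (a minimal illustration with arbitrary probability values, not taken from any of the papers in this issue), using the standard form Sq = (1 - Σ pᵢ^q)/(q - 1) with k_B = 1:

```python
import math

def tsallis_entropy(p, q):
    """Tsallis entropy S_q = (1 - sum_i p_i**q) / (q - 1), with k_B = 1."""
    if math.isclose(q, 1.0):
        # The q -> 1 limit recovers the Boltzmann-Gibbs/Shannon entropy.
        return -sum(pi * math.log(pi) for pi in p if pi > 0)
    return (1.0 - sum(pi ** q for pi in p if pi > 0)) / (q - 1.0)

# Two statistically independent subsystems: joint probabilities factorize.
pA = [0.5, 0.5]
pB = [0.2, 0.8]
pAB = [a * b for a in pA for b in pB]

q = 0.7
sA, sB, sAB = (tsallis_entropy(x, q) for x in (pA, pB, pAB))

# Pseudo-additivity: S_q(A+B) = S_q(A) + S_q(B) + (1 - q) S_q(A) S_q(B),
# so S_q is non-additive for q != 1.
assert math.isclose(sAB, sA + sB + (1 - q) * sA * sB)

# For q = 1 the entropy is additive, as in Boltzmann-Gibbs statistics.
assert math.isclose(tsallis_entropy(pAB, 1.0),
                    tsallis_entropy(pA, 1.0) + tsallis_entropy(pB, 1.0))
```

For independent subsystems the joint entropy picks up the extra cross term (1 - q)·Sq(A)·Sq(B), which vanishes only in the Boltzmann-Gibbs limit q → 1.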
This special issue consists of nine regular papers, covering various aspects and applications of Tsallis non-additive entropy, and an invited review paper written by Tsallis [1]. In this review, the following aspects of Tsallis entropy are discussed: (i) Additivity versus extensivity; (ii) Probability distributions that constitute attractors in the sense of Central Limit Theorems; (iii) The analysis of paradigmatic low-dimensional nonlinear dynamical systems near the edge of chaos; and (iv) The analysis of paradigmatic long-range-interacting many-body classical Hamiltonian systems. Finally, recent as well as typical predictions, verifications, and applications of these concepts in natural, artificial, and social systems, obtained through theoretical, experimental, observational, and computational results, are presented.
In their paper, Zhang and Wu [2] propose a global multi-level thresholding method for image segmentation that applies the Tsallis entropy, as a general information-theoretic entropy formalism, together with an artificial bee colony algorithm. They demonstrate that Tsallis entropy is superior to traditional maximum-entropy thresholding based on Shannon entropy, and that the artificial bee colony algorithm is faster than either a genetic algorithm or particle swarm optimization. Vila, Bardera, Feixas and Sbert [3] investigate the application of three different Tsallis-based generalizations of mutual information to analyze the similarity between scanned documents. Castelló et al. [4] study and compare different information-theoretic measures for polygonal mesh simplification, applying generalized measures from information theory such as the Havrda-Charvát-Tsallis entropy and mutual information. Baeten and Naudts [5] present two arguments why the thermodynamic entropy of non-extensive systems involves Rényi's entropy function rather than that of Tsallis. The first argument is that the temperature of the configurational subsystem of a mono-atomic gas is equal to that of the kinetic subsystem, and the second is that the instability of the pendulum, which occurs for energies close to the rotation threshold, is correctly reproduced. In another paper, Nelson, Scannell and Landau [6] clarify the relationship between the generalized mean of probabilities and the generalized entropy functions defined by Tsallis and Rényi. They show that both the Tsallis and Rényi entropy functions incorporate the generalized mean of the probability states, and that their difference lies in the translation of this mean to an entropy scale, using either the deformed logarithm for Tsallis entropy or the natural logarithm for Rényi entropy.
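The Tsallis-Rényi relationship described by Nelson, Scannell and Landau can be verified numerically. The following is a minimal sketch assuming the standard definitions of the q-logarithm and the self-weighted power mean of the probabilities (the distribution and index are arbitrary, chosen for illustration only):

```python
import math

def q_log(x, q):
    """Deformed (q-)logarithm: ln_q(x) = (x**(1-q) - 1) / (1 - q)."""
    return (x ** (1.0 - q) - 1.0) / (1.0 - q)

p = [0.1, 0.2, 0.3, 0.4]   # an arbitrary probability distribution
q = 1.5

s = sum(pi ** q for pi in p)

# Generalized mean of the probabilities (self-weighted power mean):
# M = (sum_i p_i * p_i**(q-1))**(1/(q-1)) = s**(1/(q-1))
M = s ** (1.0 / (q - 1.0))

renyi = math.log(s) / (1.0 - q)     # Rényi entropy
tsallis = (1.0 - s) / (q - 1.0)     # Tsallis entropy

# Both entropies translate the same generalized mean to an entropy scale:
assert math.isclose(renyi, math.log(1.0 / M))     # natural logarithm
assert math.isclose(tsallis, q_log(1.0 / M, q))   # deformed logarithm
```

Both entropies thus arise from the same generalized mean; only the choice of logarithm, natural versus deformed, distinguishes Rényi from Tsallis.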
Telesca [7] performs a non-extensive analysis of the southern California earthquake catalog. His results show that the non-extensivity parameter q lies in the same range as obtained for other seismic areas, suggesting a sort of universal character of the non-extensive interpretation of seismicity. Balasis et al. [8] present results from the application of Tsallis statistical mechanics to the detection of dynamical changes related to the occurrence of magnetic storms, and describe attempts to approach the dynamics of magnetic storms and solar flares in terms of universality through Tsallis statistics. Ribeiro, Nobre and Curado [9] derive a general nonlinear N-dimensional Fokker-Planck equation directly from a master equation by considering nonlinearities in the transition rates. They show that classes of nonlinear N-dimensional Fokker-Planck equations are connected to a single entropic form, with emphasis on the class of equations associated with Tsallis entropy, for both the standard and generalized definitions of the internal energy. Finally, Eguchi, Komori and Kato [10] discuss a one-parameter family of generalized cross entropies between two distributions with a power index, called the projective power entropy, which reduces to the Tsallis entropy when the two distributions are equal. They investigate statistical and probabilistic properties of the projective power entropy, including a characterization of the conditions that uniquely determine it up to the power index.
I am happy to see that this special issue includes a wide range of topics, from information theory and image processing to magnetic storms and earthquakes. It also includes a very important review by Constantino Tsallis. I hope this special issue stimulates further research on Tsallis entropy. I would like to thank all the authors and reviewers who contributed to this special issue, as well as Sarah Shao from the Entropy Editorial Office for her assistance with its publication.