Displaying articles 1-12

p. 701-716
Received: 1 March 2012 / Revised: 19 March 2012 / Accepted: 30 March 2012 / Published: 10 April 2012

Cited by 17
Abstract: In this paper we utilize the Tsallis relative entropy, a generalization of the Kullback–Leibler entropy in the framework of non-extensive thermodynamics, to analyze the properties of anomalous diffusion processes. Anomalous (super-)diffusive behavior can be described by fractional diffusion equations, where the second-order space derivative is extended to fractional order α ∈ (1, 2). They represent a bridging regime: for α = 2 one obtains the diffusion equation, and for α = 1 the (half) wave equation. These fractional diffusion equations are solved by so-called stable distributions, which exhibit heavy tails and skewness. In contrast to the Shannon or Tsallis entropy of these distributions, the Kullback–Leibler and Tsallis relative entropies, taken relative to the pure diffusion case, induce a natural ordering of the stable distributions consistent with the ordering implied by the pure diffusion and wave limits.
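As an illustrative sketch only (not code from the paper), the discrete Tsallis relative entropy D_q(p‖r) = (1/(q−1)) Σ_i p_i [(p_i/r_i)^(q−1) − 1], which recovers the Kullback–Leibler divergence as q → 1, can be evaluated directly; the function name and example distributions here are our own:

```python
import math

def tsallis_relative_entropy(p, r, q):
    """Discrete Tsallis relative entropy D_q(p || r).

    D_q(p || r) = (1/(q-1)) * sum_i p_i * ((p_i / r_i)**(q-1) - 1),
    which reduces to the Kullback-Leibler divergence in the limit q -> 1.
    Terms with p_i = 0 contribute nothing and are skipped.
    """
    if abs(q - 1.0) < 1e-12:
        # q -> 1 limit: ordinary Kullback-Leibler divergence
        return sum(pi * math.log(pi / ri) for pi, ri in zip(p, r) if pi > 0)
    return sum(pi * ((pi / ri) ** (q - 1) - 1.0)
               for pi, ri in zip(p, r) if pi > 0) / (q - 1.0)

p = [0.5, 0.3, 0.2]
r = [0.4, 0.4, 0.2]
kl = tsallis_relative_entropy(p, r, 1.0)       # KL divergence
near_kl = tsallis_relative_entropy(p, r, 1.001)  # close to KL for q near 1
```

Note that D_q(p‖p) = 0 for any q, and D_q varies continuously through the q → 1 limit, which is the sense in which it generalizes the Kullback–Leibler case.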

p. 174-176
Received: 2 February 2012 / Accepted: 2 February 2012 / Published: 3 February 2012

Cited by 4
Abstract: One of the crucial properties of the Boltzmann–Gibbs entropy in the context of classical thermodynamics is extensivity, namely proportionality with the number of elements of the system. The Boltzmann–Gibbs entropy satisfies this prescription if the subsystems are statistically (quasi-)independent, or typically if the correlations within the system are essentially local. In such cases the energy of the system is typically extensive and the entropy is additive. In general, however, the situation is not of this type and correlations may be far from negligible at all scales. In 1988 Tsallis introduced an entropic expression characterized by an index q which leads to non-extensive statistics. Tsallis entropy, Sq, is the basis of so-called non-extensive statistical mechanics, which generalizes the Boltzmann–Gibbs theory. Tsallis statistics have found applications in a wide range of phenomena in diverse disciplines such as physics, chemistry, biology, medicine, economics, and geophysics. The focus of this special issue of Entropy was to solicit contributions that apply Tsallis entropy in various scientific fields. [...]

p. 1928-1944
Received: 4 October 2011 / Accepted: 21 October 2011 / Published: 1 November 2011

Cited by 19
Abstract: Several previous results valid for one-dimensional nonlinear Fokker–Planck equations are generalized to N dimensions. A general nonlinear N-dimensional Fokker–Planck equation is derived directly from a master equation, by considering nonlinearities in the transition rates. Using nonlinear Fokker–Planck equations, the H-theorem is proved; for that, an important relation involving these equations and general entropic forms is introduced. It is shown that, due to this relation, classes of nonlinear N-dimensional Fokker–Planck equations are connected to a single entropic form. Particular emphasis is given to the class of equations associated with Tsallis entropy, for both the standard and generalized definitions of the internal energy.

p. 1865-1881
Received: 1 September 2011 / Revised: 28 September 2011 / Accepted: 30 September 2011 / Published: 14 October 2011

Cited by 9
Abstract: Over the last couple of decades, nonextensive Tsallis entropy has shown remarkable applicability in describing nonequilibrium physical systems with large variability and multifractal structure. Herein, we review recent results from the application of Tsallis statistical mechanics to the detection of dynamical changes related to the occurrence of magnetic storms. We extend our review to describe attempts to approach the dynamics of magnetic storms and solar flares by means of universality through Tsallis statistics. We also include a discussion of possible implications for space weather forecasting efforts arising from the verification of Tsallis entropy in the complex system of the magnetosphere.

p. 1805-1828
Received: 1 August 2011 / Revised: 20 September 2011 / Accepted: 27 September 2011 / Published: 29 September 2011

Cited by 1
Abstract: This paper presents a study and comparison of different information-theoretic measures for polygonal mesh simplification. Generalized measures from information theory, such as the Havrda–Charvát–Tsallis entropy and mutual information, have been applied. These measures have been used in the error metric of a surface simplification algorithm. We demonstrate that these measures are useful for simplifying three-dimensional polygonal meshes. We have also compared these metrics with the error metrics used in a geometry-based method and in an image-driven method. Quantitative results of the comparison are presented using the root-mean-square error (RMSE).

p. 1765-1804
Received: 15 August 2011 / Revised: 11 September 2011 / Accepted: 19 September 2011 / Published: 28 September 2011

Cited by 39
Abstract: The nonadditive entropy S_{q} was introduced in 1988 as a generalization of Boltzmann–Gibbs (BG) statistical mechanics. The aim was to cover a (possibly wide) class of systems, among the very many which violate hypotheses such as ergodicity, under which the BG theory is expected to be valid. It is now known that S_{q} has a wide applicability; more specifically, even outside Hamiltonian systems and their thermodynamical approach. In the present paper we review and comment on some relevant aspects of this entropy, namely (i) additivity versus extensivity; (ii) probability distributions that constitute attractors in the sense of central limit theorems; (iii) the analysis of paradigmatic low-dimensional nonlinear dynamical systems near the edge of chaos; and (iv) the analysis of paradigmatic long-range-interacting many-body classical Hamiltonian systems. Finally, we exhibit recent as well as typical predictions, verifications and applications of these concepts in natural, artificial, and social systems, as shown through theoretical, experimental, observational and computational results.

p. 1746-1764
Received: 26 July 2011 / Revised: 20 September 2011 / Accepted: 20 September 2011 / Published: 26 September 2011

Cited by 9
Abstract: We discuss a one-parameter family of generalized cross entropies between two distributions indexed by a power parameter, called the projective power entropy. The cross entropy essentially reduces to the Tsallis entropy if the two distributions are taken to be equal. Statistical and probabilistic properties associated with the projective power entropy are extensively investigated, including a characterization problem of which conditions uniquely determine the projective power entropy up to the power index. A close relation of the entropy with the Lebesgue space L_{p} and the dual L_{q} is explored, in which the escort distribution is associated with an interesting property. When we consider maximum Tsallis entropy distributions under the constraints of the mean vector and variance matrix, the model becomes a multivariate q-Gaussian model with elliptical contours, including the Gaussian and t-distribution models. We discuss statistical estimation by minimization of the empirical loss associated with the projective power entropy. It is shown that the minimum loss estimators for the mean vector and variance matrix under the maximum entropy model are the sample mean vector and the sample variance matrix. The escort distribution of the maximum entropy distribution plays a key role in the derivation.

p. 1694-1707
Received: 1 August 2011 / Revised: 5 September 2011 / Accepted: 8 September 2011 / Published: 14 September 2011

Cited by 6
Abstract: Mutual information is one of the most widely used measures for evaluating image similarity. In this paper, we investigate the application of three different Tsallis-based generalizations of mutual information to analyze the similarity between scanned documents. These three generalizations derive from the Kullback–Leibler distance, the difference between entropy and conditional entropy, and the Jensen–Tsallis divergence, respectively. In addition, the ratio between these measures and the Tsallis joint entropy is analyzed. The performance of all these measures is studied for different entropic indexes in the context of document classification and registration.

p. 1518-1532
Received: 17 May 2011 / Revised: 4 August 2011 / Accepted: 11 August 2011 / Published: 17 August 2011

Cited by 5
Abstract: E.T. Jaynes, originator of the maximum entropy interpretation of statistical mechanics, emphasized that there is an inevitable trade-off between the conflicting requirements of robustness and accuracy for any inferencing algorithm. This is because robustness requires discarding information in order to reduce sensitivity to outliers. The principle of nonlinear statistical coupling, which is an interpretation of the Tsallis entropy generalization, can be used to quantify this trade-off. The coupled surprisal, -ln_{κ}(p) ≡ -(p^{κ}-1)/κ, is a generalization of the Shannon surprisal, or the logarithmic scoring rule, given a forecast p of a true event by an inferencing algorithm. The coupling parameter κ = 1-q, where q is the Tsallis entropy index, is the degree of nonlinear coupling between statistical states. Positive (negative) values of nonlinear coupling decrease (increase) the surprisal information metric and thereby bias the risk in favor of decisive (robust) algorithms relative to the Shannon surprisal (κ = 0). We show that translating the average coupled surprisal into an effective probability is equivalent to using the generalized mean of the true event probabilities as a scoring rule. The metric is used to assess the robustness, accuracy, and decisiveness of a fusion algorithm. We use a two-parameter fusion algorithm to combine input probabilities from N sources. The generalized mean parameter α varies the degree of smoothing, and raising to a power N^{β} with β between 0 and 1 provides a model of correlation.
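The coupled surprisal defined in this abstract is simple to evaluate directly. The following sketch (our own code, with an illustrative forecast p = 0.25) checks the κ → 0 Shannon limit and the decisive/robust bias described above:

```python
import math

def coupled_surprisal(p, kappa):
    """Coupled surprisal -ln_kappa(p) = -(p**kappa - 1) / kappa.

    kappa = 1 - q, where q is the Tsallis entropy index; the limit
    kappa -> 0 recovers the ordinary Shannon surprisal -ln(p).
    """
    if abs(kappa) < 1e-12:
        return -math.log(p)
    return -(p ** kappa - 1.0) / kappa

p = 0.25  # illustrative forecast probability of the true event
shannon = coupled_surprisal(p, 0.0)     # -ln(0.25)
decisive = coupled_surprisal(p, 0.5)    # kappa > 0: lower surprisal
robust = coupled_surprisal(p, -0.5)     # kappa < 0: higher surprisal
```

The ordering robust > shannon > decisive matches the abstract's statement that positive coupling decreases, and negative coupling increases, the surprisal metric.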

p. 1267-1280
Received: 6 June 2011 / Revised: 21 June 2011 / Accepted: 6 July 2011 / Published: 11 July 2011

Cited by 19
Abstract: Nonextensive statistics has become a very useful tool for describing the complexity of dynamic systems. Recently, analysis of the magnitude distribution of earthquakes has been increasingly performed in the context of nonextensivity. In the present paper, a nonextensive analysis of the southern California earthquake catalog was performed. The results show that the nonextensivity parameter q lies in the same range as obtained for other seismic areas, thus suggesting a sort of universal character in the nonextensive interpretation of seismicity.

p. 1186-1199
Received: 18 May 2011 / Revised: 15 June 2011 / Accepted: 16 June 2011 / Published: 21 June 2011

Cited by 6
Abstract: We give two arguments why the thermodynamic entropy of non-extensive systems involves Rényi's entropy function rather than that of Tsallis. The first argument is that the temperature of the configurational subsystem of a mono-atomic gas is equal to that of the kinetic subsystem. The second argument is that the instability of the pendulum, which occurs for energies close to the rotation threshold, is correctly reproduced.

p. 841-859
Received: 2 March 2011 / Revised: 17 March 2011 / Accepted: 29 March 2011 / Published: 13 April 2011

Cited by 51
Abstract: This paper proposes a global multi-level thresholding method for image segmentation. As a criterion, the traditional method uses the Shannon entropy, originating from information theory, treating the gray-level image histogram as a probability distribution; we instead applied the Tsallis entropy as a more general information-theoretic entropy formalism. For the optimization we used the artificial bee colony approach, since execution of an exhaustive algorithm would be too time-consuming. The experiments demonstrate that: (1) the Tsallis entropy is superior to traditional maximum entropy thresholding, maximum between-class variance thresholding, and minimum cross entropy thresholding; and (2) the artificial bee colony is faster than either the genetic algorithm or particle swarm optimization. Therefore, our approach is effective and rapid.
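As a hedged sketch of the single-threshold special case (the paper itself optimizes multiple thresholds with an artificial bee colony), the Tsallis entropy of each class can be combined pseudo-additively and maximized by exhaustive search. The function names and toy histogram are our own, and the criterion S_A + S_B + (1−q)·S_A·S_B is a commonly used Tsallis thresholding objective, not necessarily the paper's exact one:

```python
import math

def tsallis_entropy(probs, q):
    """Tsallis entropy S_q = (1 - sum p_i**q) / (q - 1); Shannon entropy as q -> 1."""
    probs = [p for p in probs if p > 0]
    if abs(q - 1.0) < 1e-12:
        return -sum(p * math.log(p) for p in probs)
    return (1.0 - sum(p ** q for p in probs)) / (q - 1.0)

def best_threshold(hist, q):
    """Exhaustive single-threshold search over a gray-level histogram,
    maximizing the pseudo-additive combination of the two class entropies:
    S_A + S_B + (1 - q) * S_A * S_B."""
    total = float(sum(hist))
    probs = [h / total for h in hist]
    best_t, best_score = None, float("-inf")
    for t in range(1, len(probs)):
        w_a, w_b = sum(probs[:t]), sum(probs[t:])
        if w_a == 0 or w_b == 0:
            continue  # skip degenerate splits with an empty class
        s_a = tsallis_entropy([p / w_a for p in probs[:t]], q)
        s_b = tsallis_entropy([p / w_b for p in probs[t:]], q)
        score = s_a + s_b + (1.0 - q) * s_a * s_b
        if score > best_score:
            best_score, best_t = score, t
    return best_t
```

On a toy bimodal histogram such as [50, 50, 0, 0, 0, 50, 50], the search places the cut at one of the empty middle bins, separating the two modes.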
