Special Issue "Information Theoretic Measures and Their Applications"
A special issue of Entropy (ISSN 1099-4300).
Deadline for manuscript submissions: closed (30 August 2020).
Guest Editor: Dr. Osvaldo Anibal Rosso
Interests: time-series analysis; information theory; time–frequency transform; wavelet transform; entropy and complexity; nonlinear dynamics and chaos; complex networks; medical and biological applications
Guest Editor: Dr. Fernando Fabian Montani
Interests: time-series analysis; information theory; brain and neuronal dynamics; neural coding; entropy and complexity; nonlinear dynamics and chaos; complex networks; medical and biological applications
The concept of entropy, an ever-growing physical magnitude that measures the degree of decay of order in a physical system, was introduced by Rudolf Clausius in 1865 through an elegant formulation of the second law of thermodynamics. Seven years later, in 1872, Ludwig Boltzmann proved the famous H-theorem, showing that the quantity H always decreases in time and that, for a perfect gas in equilibrium, H is related to Clausius' entropy S. Boltzmann's dynamical approach, together with the elegant theory of statistical ensembles at equilibrium proposed by Josiah Willard Gibbs, led to the Boltzmann–Gibbs theory of statistical mechanics, which represents one of the most successful theoretical frameworks of physics. In fact, with the introduction of entropy, thermodynamics became a model of theoretical science.
In 1948, Claude E. Shannon developed a "statistical theory of communication", taking ideas from both logic and statistics, which in turn opened new paths for research. The powerful notion of information entropy played a major part in the development of new statistical techniques and overhauled the Bayesian approach to probability and statistics, providing powerful new tools and shedding new light on several fields of science.
In the space of a few decades, chaos theory has jumped from the scientific literature into the popular realm, being regarded as a new way of looking at complex systems such as brains or ecosystems; the theory is believed to capture the disorganized order that pervades our world. Chaos theory is the facet of the complex systems paradigm concerned with deterministic randomness. In 1959, Kolmogorov observed that Shannon's probabilistic theory of information could be applied to symbolic encodings of the phase-space descriptions of nonlinear dynamical systems, so that a process can be characterized in terms of its Kolmogorov–Sinai entropy. Pesin's theorem (1977) proved that, for certain deterministic nonlinear dynamical systems exhibiting chaotic behavior, an estimate of the Kolmogorov–Sinai entropy is given by the sum of the positive Lyapunov exponents of the process. Thus, a nonlinear dynamical system may be viewed as an information source, from which information-related quantifiers may help to characterize and visualize relevant details of the chaotic process.
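As a concrete illustration (the function name and parameter choices below are ours, not part of the Issue's scope), Pesin's identity can be checked numerically on the fully chaotic logistic map, whose single positive Lyapunov exponent, and hence its Kolmogorov–Sinai entropy, is ln 2:

```python
import numpy as np

def logistic_lyapunov(r=4.0, x0=0.3, n_iter=100_000, n_transient=1_000):
    """Estimate the Lyapunov exponent of the logistic map x -> r*x*(1 - x)
    by averaging log|f'(x)| along an orbit. For this one-dimensional
    chaotic map, Pesin's identity equates the result with the
    Kolmogorov-Sinai entropy h_KS."""
    x = x0
    for _ in range(n_transient):                     # discard the transient
        x = r * x * (1.0 - x)
    log_sum = 0.0
    for _ in range(n_iter):
        x = r * x * (1.0 - x)
        log_sum += np.log(abs(r * (1.0 - 2.0 * x)))  # log |f'(x)|
    return log_sum / n_iter

print(logistic_lyapunov())  # ~0.693 nats/iteration = ln 2 for r = 4
```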
The information content of a system is typically evaluated via a probability distribution function (PDF) P describing the apportionment of some measurable or observable quantity, generally a time series X(t). Quantifying the information content of a given observable is therefore largely tantamount to characterizing its probability distribution. This is often done with a wide family of measures called information-theoretic quantifiers (e.g., Shannon entropy, mutual information, Fisher information, and statistical complexity). Information-theoretic quantifiers thus characterize the relevant properties of the PDF associated with a time series and, in this way, allow us to judiciously extract information about the dynamical system under study.
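As a minimal sketch of this idea (illustrative code; the histogram-based PDF and the 32-bin choice are our own simplifying assumptions), one can estimate the PDF of a time series by amplitude histogramming and then evaluate its Shannon entropy:

```python
import numpy as np

def shannon_entropy(ts, n_bins=32):
    """Shannon entropy (in bits) of a time series, with the PDF
    estimated by simple amplitude histogramming (frequency counting)."""
    counts, _ = np.histogram(ts, bins=n_bins)
    p = counts / counts.sum()
    p = p[p > 0]                       # convention: 0 * log 0 = 0
    return -np.sum(p * np.log2(p))

rng = np.random.default_rng(0)
print(shannon_entropy(rng.uniform(size=10_000)))  # near log2(32) = 5 bits
print(shannon_entropy(rng.normal(size=10_000)))   # lower: mass is concentrated
```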
The evaluation of information-theoretic quantifiers presupposes some prior knowledge about the system; specifically, a probability distribution associated with the time series under analysis must be provided beforehand. The determination of the most adequate PDF is a fundamental problem, because the PDF P and the sample space Ω are inextricably linked. Many methods have been proposed for a proper selection of the probability space (Ω, P), among them frequency counting, procedures based on amplitude statistics, binary symbolic dynamics, Fourier analysis, wavelet transforms, and permutation patterns. The suitability of each methodology depends on the peculiarities of the data, such as stationarity, series length, parameter variation, and level of noise contamination. In all these cases, global aspects of the dynamics can somehow be captured, but the different approaches are not equivalent in their ability to discern all relevant physical details.
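As an example of one such choice of (Ω, P), the following sketch (an illustrative, unoptimized implementation) builds the Bandt–Pompe ordinal-pattern PDF and the resulting normalized permutation entropy; the embedding dimension d and delay tau are the method's usual free parameters:

```python
import numpy as np
from itertools import permutations
from math import factorial

def ordinal_pdf(ts, d=3, tau=1):
    """Bandt-Pompe PDF: relative frequencies of the d! ordinal
    (permutation) patterns for embedding dimension d and delay tau."""
    ts = np.asarray(ts)
    n = len(ts) - (d - 1) * tau
    counts = {p: 0 for p in permutations(range(d))}
    for i in range(n):
        window = ts[i : i + d * tau : tau]       # d samples, spaced tau apart
        counts[tuple(np.argsort(window))] += 1   # rank order = ordinal pattern
    return np.array(list(counts.values())) / n

def permutation_entropy(ts, d=3, tau=1):
    """Shannon entropy of the ordinal PDF, normalized to [0, 1]."""
    p = ordinal_pdf(ts, d, tau)
    p = p[p > 0]
    return -np.sum(p * np.log(p)) / np.log(factorial(d))
```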
Mutual information rigorously quantifies, in units known as "bits", how much information the value of one variable reveals about the value of another. It is a dimensionless quantity that can be thought of as the reduction in uncertainty about one random variable given knowledge of another. Fisher information, which predates the Shannon entropy, and the more recent statistical complexities have also proved to be useful and powerful tools in different scenarios, allowing, in particular, the analysis of time series and data series independently of their sources. The Fisher information measure can be variously interpreted as a measure of the ability to estimate a parameter, as the amount of information that can be extracted from a set of measurements, and as a measure of the state of disorder of a system or phenomenon.
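A minimal plug-in estimator of mutual information, assuming (for illustration only) that the joint PDF is modeled by a simple two-dimensional histogram, could read:

```python
import numpy as np

def mutual_information(x, y, n_bins=16):
    """Plug-in mutual information estimate (in bits) from a joint
    2-D histogram, via I(X;Y) = H(X) + H(Y) - H(X,Y)."""
    joint, _, _ = np.histogram2d(x, y, bins=n_bins)
    pxy = joint / joint.sum()
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)

    def h(p):                                    # Shannon entropy in bits
        p = p[p > 0]
        return -np.sum(p * np.log2(p))

    return h(px) + h(py) - h(pxy.ravel())

rng = np.random.default_rng(1)
x = rng.normal(size=5_000)
print(mutual_information(x, x + 0.1 * rng.normal(size=5_000)))  # large: coupled
print(mutual_information(x, rng.normal(size=5_000)))            # ~0: independent
```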
Among the most recent entropy proposals, we can mention approximate entropy, sample entropy, delayed permutation entropy, and permutation min-entropy. Different methodologies have thus been used to understand the mechanisms behind information processing. Among them are also methods of frequency analysis such as the wavelet transform (WT), which distinguishes itself by its high efficiency in feature extraction; wavelet analysis is the appropriate mathematical tool for analyzing signals in both the time and frequency domains. All these measures have important applications not only in physics but also in quite distinct areas, such as biology, medicine, economics, the cognitive sciences, numerical and computational sciences, big-data analysis, complex networks, and neuroscience.
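For concreteness, a compact (illustrative and unoptimized) implementation of sample entropy, with the customary tolerance r set to 20% of the series' standard deviation, might look as follows:

```python
import numpy as np

def sample_entropy(ts, m=2, r=None):
    """Sample entropy SampEn(m, r) = -ln(A / B), where B counts template
    pairs of length m within tolerance r (Chebyshev distance) and A is
    the analogous count for length m + 1; self-matches are excluded."""
    ts = np.asarray(ts, dtype=float)
    if r is None:
        r = 0.2 * ts.std()            # common heuristic: 20% of the SD
    n_templ = len(ts) - m             # same template count for m and m + 1

    def count_pairs(k):
        templ = np.array([ts[i : i + k] for i in range(n_templ)])
        count = 0
        for i in range(n_templ - 1):
            dist = np.max(np.abs(templ[i + 1 :] - templ[i]), axis=1)
            count += int(np.sum(dist <= r))
        return count

    b, a = count_pairs(m), count_pairs(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf
```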
In summary, for the present Special Issue, manuscripts focused on any of the abovementioned information-theoretic measures (mutual information, permutation entropy approaches, sample entropy, wavelet entropy, and their evaluation), as well as their interdisciplinary applications, are more than welcome.
Dr. Osvaldo Anibal Rosso
Dr. Fernando Fabian Montani
Manuscript Submission Information
Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All papers will be peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.
Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.
Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1800 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.
- Shannon entropy
- mutual information
- Fisher information
- statistical complexity
- information processing
- different PDF evaluations
- different dynamic states captured by information theoretical approaches