Special Issue "Entropy Measures for Data Analysis II: Theory, Algorithms and Applications"

A special issue of Entropy (ISSN 1099-4300).

Deadline for manuscript submissions: 31 July 2020.

Special Issue Editor

Prof. Dr. Karsten Keller

Guest Editor
Institut für Mathematik, Universität zu Lübeck, D-23562 Lübeck, Germany
Interests: data analysis; time series analysis; computational statistics; information theory; ergodic theory; machine learning

Special Issue Information

Dear Colleagues,

Entropies and entropy-like quantities are playing an increasing role in modern nonlinear data analysis. Their fields of application range from diagnostics in physiology, for instance, electroencephalography (EEG), magnetoencephalography (MEG), and electrocardiography (ECG), to econophysics and engineering. During the last few years, classical concepts such as approximate entropy and sample entropy have been supplemented by new entropy measures, like permutation entropy and its various variants. Recent developments focus on multidimensional generalizations of these concepts, with special emphasis on quantifying the coupling between time series and the system components behind them.
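To make the measures mentioned above concrete, the following is a minimal Python sketch of permutation entropy for a scalar time series; the embedding dimension, delay, and normalization by log(m!) are illustrative choices of this sketch, not prescriptions of the Special Issue.

```python
# Minimal sketch: permutation (ordinal-pattern) entropy of a 1-D series.
# The embedding dimension m, the delay, and the normalization by log(m!)
# are illustrative choices.
import math
from collections import Counter

def permutation_entropy(x, m=3, delay=1):
    """Normalized Shannon entropy of the ordinal patterns of length m."""
    n = len(x) - (m - 1) * delay
    if n <= 0:
        raise ValueError("series too short for the chosen embedding")
    patterns = Counter()
    for i in range(n):
        window = [x[i + j * delay] for j in range(m)]
        # The ordinal pattern is the permutation of indices that sorts the window.
        patterns[tuple(sorted(range(m), key=window.__getitem__))] += 1
    probs = [c / n for c in patterns.values()]
    h = -sum(p * math.log(p) for p in probs)
    return h / math.log(math.factorial(m))  # 0 = fully regular, 1 = maximally irregular

if __name__ == "__main__":
    import random
    print(permutation_entropy([random.random() for _ in range(1000)]))  # close to 1
    print(permutation_entropy(list(range(1000))))                       # exactly 0
```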

Many of the considered entropy-based concepts and approaches are not fully understood, and there is a need to explore systematically what and how much information about a data set, and about the system behind it, the entropy measures actually contain. In this context, the use of entropy measures as features in learning theory and as an instrument of data science is of special interest. Not surprisingly, special emphasis has to be placed on the development of fast and efficient algorithms for determining and dealing with entropy measures in the case of large and complex data sets.

Prof. Dr. Karsten Keller
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All papers will be peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Published Papers (2 papers)

Research

Open Access Article
Storage Space Allocation Strategy for Digital Data with Message Importance
Entropy 2020, 22(5), 591; https://doi.org/10.3390/e22050591 - 25 May 2020
Abstract
This paper focuses on the problem of lossy compression storage based on data value, which represents the subjective assessment of users, when the storage size is still insufficient after conventional lossless data compression. To this end, we transform this problem into an optimization that pursues the least importance-weighted reconstruction error within a limited total storage size, where the importance characterizes the data value from the viewpoint of users. Based on this, the paper puts forward an optimal allocation strategy for the storage of digital data under an exponential distortion measure, which makes rational use of all the storage space. In fact, the theoretical results show that it is a kind of restrictive water-filling. It also characterizes the trade-off between the relative weighted reconstruction error and the available storage size. Consequently, if a relatively small part of the total data value is allowed to be lost, this strategy improves the performance of data compression. Furthermore, this paper also shows that both the users' preferences and the special characteristics of the data distribution can trigger small-probability event scenarios in which only a fraction of the data covers the vast majority of users' interests. In either case, data with highly clustered message importance are beneficial for compression storage. In contrast, from the perspective of optimal storage space allocation based on data value, data with a uniform information distribution are incompressible, which is consistent with classical information theory.
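As a purely illustrative companion to the abstract above, the sketch below solves a generic importance-weighted allocation problem with an exponential distortion term by water-filling on the Lagrange multiplier; the distortion model w_i·exp(−a·s_i), the parameter a, and the bisection scheme are assumptions of this sketch and need not coincide with the paper's exact formulation.

```python
# Hedged sketch: allocate a total storage budget across items so as to
# minimize sum_i w_i * exp(-a * s_i), an assumed exponential distortion
# weighted by message importance w_i. Stationarity of the Lagrangian gives
# s_i = max(0, ln(a * w_i / lam) / a); lam is found by bisection ("water-filling").
import math

def allocate_storage(weights, budget, a=1.0, iters=100):
    """Return non-negative sizes s_i whose sum is close to `budget`."""
    def sizes(lam):
        return [max(0.0, math.log(a * w / lam) / a) for w in weights]

    lo, hi = 1e-12, a * max(weights)      # at lam = a*max(w), every s_i is 0
    for _ in range(iters):
        mid = math.sqrt(lo * hi)          # geometric bisection over orders of magnitude
        if sum(sizes(mid)) > budget:
            lo = mid                      # allocation too large -> increase lam
        else:
            hi = mid
    return sizes(hi)

if __name__ == "__main__":
    w = [8.0, 4.0, 2.0, 1.0]              # importance weights
    s = allocate_storage(w, budget=3.0)
    print([round(v, 3) for v in s])       # more important items get more space;
                                          # the least important one may get none
```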

Open Access Article
Regime-Switching Discrete ARMA Models for Categorical Time Series
Entropy 2020, 22(4), 458; https://doi.org/10.3390/e22040458 - 17 Apr 2020
Abstract
For the modeling of categorical time series, both nominal and ordinal, an extension of the basic discrete autoregressive moving-average (ARMA) models is proposed. It uses an observation-driven regime-switching mechanism, leading to the family of RS-DARMA models. After discussing the stochastic properties of RS-DARMA models in general, we focus on the particular case of the first-order RS-DAR model. This RS-DAR(1) model constitutes a parsimoniously parameterized type of Markov chain, which has an easy-to-interpret data-generating mechanism and can also handle negative forms of serial dependence. Approaches to model fitting are elaborated and illustrated by two real-data examples: the modeling of a nominal sequence from biology, and of an ordinal time series on cloudiness. For future research, one might use the RS-DAR(1) model for constructing parsimonious advanced models, and one might adapt techniques for smoother regime transitions.
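As a purely illustrative companion to the abstract above, the sketch below simulates a DAR(1)-type categorical series in which the dependence parameter switches between two regimes according to the previous observation; the trigger-set switching rule and all parameter values are hypothetical and are not taken from the paper.

```python
# Hedged sketch: a DAR(1)-type categorical series with an observation-driven
# switch of the dependence parameter phi between two regimes. In a DAR(1)
# model, X_t repeats X_{t-1} with probability phi and is otherwise a fresh
# draw from the marginal distribution; here phi depends on whether the
# previous value lies in a hypothetical trigger set.
import random

def simulate_rs_dar1(n, states, marginal, phi_by_regime, trigger, seed=0):
    """Simulate n observations of the switching DAR(1)-type process."""
    rng = random.Random(seed)
    draw = lambda: rng.choices(states, weights=marginal, k=1)[0]
    x = [draw()]
    for _ in range(1, n):
        regime = 1 if x[-1] in trigger else 0      # observation-driven regime
        phi = phi_by_regime[regime]
        x.append(x[-1] if rng.random() < phi else draw())
    return x

if __name__ == "__main__":
    series = simulate_rs_dar1(
        n=20,
        states=["clear", "cloudy", "overcast"],
        marginal=[0.3, 0.4, 0.3],
        phi_by_regime=[0.2, 0.8],                  # weak vs. strong persistence
        trigger={"overcast"},                      # previous 'overcast' -> regime 1
    )
    print(series)
```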
