Special Issue "Symbolic Entropy Analysis and Its Applications II"

A special issue of Entropy (ISSN 1099-4300).

Deadline for manuscript submissions: 31 August 2020.

Special Issue Editor

Dr. Raúl Alcaraz
Guest Editor
Research Group in Electronic, Biomedical and Telecommunication Engineering, Universidad de Castilla-La Mancha, Campus Universitario s/n, 16071, Cuenca, Spain
Interests: entropy; complexity; information theory; information geometry; nonlinear dynamics; computational mathematics and statistics in medicine; biomedical time series analysis; cardiac signal processing

Special Issue Information

Dear Colleagues,

Symbolic data analysis has received a great deal of attention over the last few years and has been applied to many research areas, including astrophysics and geophysics, biology and medicine, fluid flow, chemistry, mechanical systems, artificial intelligence, communication systems, and, recently, data mining and big data. A fundamental step in this methodology is the quantization of the original data into a corresponding sequence of symbols. The resulting time series is then considered a transformed version of the original data, making it possible to highlight its temporal information. Indeed, it has been proven that this symbolization procedure can notably improve the signal-to-noise ratio in some noisy time series. Moreover, symbolic data analysis also makes communication and numerical computation more efficient and effective than the processing of continuous-valued time series.
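As a simple illustration of the quantization step described above, the following Python sketch converts a continuous-valued series into a symbol sequence using equal-probability (quantile) partitioning; the alphabet size and partitioning rule here are illustrative choices, not ones prescribed by the Special Issue:

```python
import numpy as np

def symbolize(x, n_symbols=4):
    """Quantize a continuous-valued series into a sequence of symbols
    0 .. n_symbols-1 using equal-probability (quantile) bins."""
    x = np.asarray(x, dtype=float)
    # Interior quantile edges split the amplitude range into
    # n_symbols equally populated bins
    edges = np.quantile(x, np.linspace(0, 1, n_symbols + 1)[1:-1])
    return np.digitize(x, edges)

# Example: a noisy sine wave becomes a coarse 4-symbol sequence
rng = np.random.default_rng(0)
t = np.linspace(0, 4 * np.pi, 200)
series = np.sin(t) + 0.1 * rng.standard_normal(200)
symbols = symbolize(series)
```

Quantile bins are only one option; fixed-width amplitude bins or sign-of-difference coding are equally common symbolization rules.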

However, the symbolization of a time series always involves information loss, and, hence, this process deserves special attention. This challenge, along with other problems associated with symbolic entropy analysis, was addressed in the first volume of the Special Issue on “Symbolic Entropy Analysis and Its Applications”. Given its success, this second volume aims to compile key current research on novel symbolization approaches, as well as applications of this kind of analysis to uncover novel information from different types of time series. Manuscripts dealing with these topics are therefore welcome.

Prof. Dr. Raúl Alcaraz Martínez
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All papers will be peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • Symbolic data analysis
  • Symbolization approaches
  • Symbolic entropy
  • Transfer entropy
  • Permutation entropy
  • Lempel–Ziv complexity

Published Papers (3 papers)


Research

Open Access Article
Permutation Entropy as a Measure of Information Gain/Loss in the Different Symbolic Descriptions of Financial Data
Entropy 2020, 22(3), 330; https://doi.org/10.3390/e22030330 - 13 Mar 2020
Abstract
Financial markets offer a large number of trading opportunities. However, their complexity makes them very difficult for decision-makers to use effectively. The volatility and noise present in the markets create a need to simplify the market picture presented to decision-makers. Symbolic representation fits this concept and greatly reduces data complexity. At the same time, however, some information from the market is lost. Our motivation is to answer the question: What is the impact of introducing different data representations on the overall amount of information derived for the decision-maker? We concentrate on the possibility of using entropy as a measure of the information gain/loss for financial data, and as a basic form, we assume permutation entropy with later modifications. We investigate different symbolic representations and compare them with the classical data representation in terms of entropy. Real-world data covering a time span of 10 years are used in the experiments. The results and the statistical verification show that extending the symbolic description of the time series does not affect the permutation entropy values.
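For readers unfamiliar with the base measure, the standard permutation-entropy computation can be sketched in a few lines of Python; this is the textbook ordinal-pattern definition, not the paper's extended symbolic descriptions:

```python
import math

def permutation_entropy(x, m=3, tau=1, normalize=True):
    """Shannon entropy of the ordinal (permutation) patterns of
    order m and time delay tau found in the series x."""
    n = len(x) - (m - 1) * tau
    counts = {}
    for i in range(n):
        window = x[i:i + m * tau:tau]
        # Ordinal pattern: the argsort of the window's values
        pattern = tuple(sorted(range(m), key=window.__getitem__))
        counts[pattern] = counts.get(pattern, 0) + 1
    h = -sum((c / n) * math.log2(c / n) for c in counts.values())
    # Normalize by the maximum entropy log2(m!) to land in [0, 1]
    return h / math.log2(math.factorial(m)) if normalize else h

# A monotone series exhibits a single ordinal pattern, hence zero entropy
h_trend = permutation_entropy(list(range(100)))
```

Because only the rank ordering within each window matters, the measure is invariant to monotone transformations of the amplitudes, which is exactly what makes it suitable for comparing different symbolic descriptions of the same series.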
(This article belongs to the Special Issue Symbolic Entropy Analysis and Its Applications II)

Open Access Article
Augmentation of Dispersion Entropy for Handling Missing and Outlier Samples in Physiological Signal Monitoring
Entropy 2020, 22(3), 319; https://doi.org/10.3390/e22030319 - 11 Mar 2020
Abstract
Entropy quantification algorithms are becoming a prominent tool for the physiological monitoring of individuals through the effective measurement of irregularity in biological signals. However, to ensure their effective adoption in monitoring applications, the performance of these algorithms needs to be robust when analysing time series containing missing and outlier samples, which are a common occurrence in physiological monitoring setups such as wearable devices and intensive care units. This paper focuses on augmenting Dispersion Entropy (DisEn) by introducing novel variations of the algorithm for improved performance in such applications. The original algorithm and its variations are tested under different experimental setups that are replicated across heart rate interval, electroencephalogram, and respiratory impedance time series. Our results indicate that the algorithmic variations of DisEn achieve considerable improvements in performance, while our analysis shows that, in agreement with previous research, outlier samples can have a major impact on the performance of entropy quantification algorithms. Consequently, the presented variations can aid the application of DisEn to physiological monitoring by mitigating the disruptive effect of missing and outlier samples.
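A bare-bones Dispersion Entropy sketch, following the standard normal-CDF class mapping, may help situate the paper; this illustrates only the baseline algorithm, not the augmented variations proposed by the authors, and the parameter defaults are my own choices:

```python
import math
import statistics
from collections import Counter

def dispersion_entropy(x, m=2, classes=3, tau=1):
    """Shannon entropy of 'dispersion patterns': each sample is mapped
    through the normal CDF and assigned to one of `classes` discrete
    classes before m-dimensional embedding (assumes non-constant x)."""
    mu, sigma = statistics.fmean(x), statistics.pstdev(x)
    # Normal-CDF mapping squashes the series into (0, 1)
    y = [0.5 * (1 + math.erf((v - mu) / (sigma * math.sqrt(2)))) for v in x]
    # Assign each squashed sample to a class label 1 .. classes
    z = [min(classes, int(v * classes) + 1) for v in y]
    n = len(z) - (m - 1) * tau
    counts = Counter(tuple(z[i:i + m * tau:tau]) for i in range(n))
    h = -sum((c / n) * math.log(c / n) for c in counts.values())
    # Normalize by log(classes^m), the maximum attainable entropy
    return h / math.log(classes ** m)
```

Because the class assignment depends on the estimated mean and standard deviation, missing and outlier samples distort the mapping itself, which is the robustness problem the paper's variations target.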
(This article belongs to the Special Issue Symbolic Entropy Analysis and Its Applications II)

Open Access Article
Optimized Dimensionality Reduction Methods for Interval-Valued Variables and Their Application to Facial Recognition
Entropy 2019, 21(10), 1016; https://doi.org/10.3390/e21101016 - 19 Oct 2019
Abstract
The center method, first proposed by Cazes et al. (Rev. Stat. Appl., 1997) and extended by Douzal-Chouakria et al. (Stat. Anal. Data Mining, 2011), generalizes the well-known Principal Component Analysis (PCA) method to particular types of symbolic objects that are characterized by multivalued interval-type variables. In contrast to classical data, symbolic data have internal variation. The authors who originally proposed the center method used the center of a hyper-rectangle in R^m as the base point for PCA, followed by the projection of all vertices of the hyper-rectangles as supplementary elements. Since these publications, the center point of the hyper-rectangle has typically been assumed to be the best point for the initial PCA. However, in this paper, we show that this is not always the case if the aim is to maximize the variance of the projections or to minimize the squared distance between the vertices and their respective projections. Instead, we propose an optimization algorithm that maximizes the variance of the projections (or minimizes the squared distances between the vertices and their respective projections) and finds the optimal point for the initial PCA. The vertices of the hyper-rectangles are then projected as supplementary elements relative to this optimal point, which we call the “Best Point” for projection. For this purpose, we propose four new algorithms and two new theorems. The proposed methods and algorithms are illustrated using a data set composed of measurements of facial characteristics from a study on facial recognition patterns for use in surveillance. The performance of our approach is compared with that of another procedure in the literature, and the results show that our symbolic analyses provide more accurate information. Our approach can be regarded as an optimization method, as it maximizes the explained variance or minimizes the squared distance between projections and the original points. In addition, the symbolic analyses generate more informative conclusions than the classical analysis, in which classical surrogates replace intervals. All the methods proposed in this paper can be executed in the RSDA package developed in R.
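The center method that the paper takes as its starting point can be sketched as follows: a minimal NumPy illustration that runs PCA on the interval midpoints and projects the hyper-rectangle vertices as supplementary points. The paper's "Best Point" optimization is not implemented here, and the toy data are my own:

```python
from itertools import product
import numpy as np

def center_pca(lows, highs):
    """PCA on the centers of interval-valued observations, then project
    every vertex of each hyper-rectangle as a supplementary point."""
    lows, highs = np.asarray(lows, float), np.asarray(highs, float)
    centers = (lows + highs) / 2.0
    mean = centers.mean(axis=0)
    # Principal axes from the SVD of the mean-centered centers matrix
    _, _, vt = np.linalg.svd(centers - mean, full_matrices=False)
    center_scores = (centers - mean) @ vt.T
    n, p = lows.shape
    vertex_scores = []
    for i in range(n):
        # The 2^p corners of observation i's hyper-rectangle
        verts = np.array([[highs[i, j] if b else lows[i, j]
                           for j, b in enumerate(bits)]
                          for bits in product((0, 1), repeat=p)])
        vertex_scores.append((verts - mean) @ vt.T)
    return center_scores, vertex_scores

# Three observations described by two interval-valued variables
scores, vertex_scores = center_pca([[1, 2], [3, 5], [0, 1]],
                                   [[2, 4], [4, 7], [1, 3]])
```

The paper's contribution is precisely that the midpoint used here as the PCA base is not always optimal with respect to projected variance or vertex reconstruction error.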
(This article belongs to the Special Issue Symbolic Entropy Analysis and Its Applications II)
