Multientropy Approaches: Combining Different Entropy Measures to Exploit Possible Synergies in Time Series Classification

A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Multidisciplinary Applications".

Deadline for manuscript submissions: closed (31 March 2020) | Viewed by 6988

Special Issue Editor


Guest Editor
Technological Institute of Informatics, Universitat Politècnica de València, Alcoi Campus, 03801 Alcoi, Spain
Interests: pattern recognition; machine learning; time series classification; nonlinear signal processing; sensor networks; medical devices

Special Issue Information

Dear Colleagues,

A myriad of entropy measures have been proposed in recent years. Thousands of studies have used them successfully for time series classification. However, there are still applications where classification performance is poor or not significant in comparison with the results obtained using other feature extraction techniques.

In this regard, many improvements to the current methods have also been suggested: greater robustness to noise, better sensitivity to differences between time series, less dependence on input parameters or data length, multiscale approaches, etc.

A line of research that has not yet been exploited so profusely is the use of more than a single entropy measure simultaneously. The rationale of this approach is that the current state of the art in entropy measures includes algorithms based on histograms of amplitude patterns, ordinal patterns, symbolic patterns, and even sorting patterns. It can therefore be hypothesized that different patterns capture different aspects of the time series dynamics, and that a combination of them, either as parameters of a single model or by means of an algorithmic integration, can provide a more complete picture of the underlying structure of the time series.
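As a minimal illustrative sketch of this idea (function names, parameters, and the choice of measures are our own, not taken from any specific submission), several entropy estimates computed over different pattern types can be stacked into a single feature vector for a downstream classifier:

```python
import numpy as np

def amplitude_entropy(x, bins=10):
    """Shannon entropy of the amplitude histogram (amplitude patterns)."""
    counts, _ = np.histogram(x, bins=bins)
    p = counts[counts > 0] / len(x)
    return -np.sum(p * np.log(p))

def permutation_entropy(x, m=3):
    """Shannon entropy of ordinal (permutation) patterns of order m."""
    patterns = {}
    for i in range(len(x) - m + 1):
        key = tuple(np.argsort(x[i:i + m]))  # ordinal pattern of the window
        patterns[key] = patterns.get(key, 0) + 1
    p = np.array(list(patterns.values()), dtype=float)
    p /= p.sum()
    return -np.sum(p * np.log(p))

def entropy_features(x):
    """Combine several entropy measures into one feature vector."""
    return np.array([amplitude_entropy(x), permutation_entropy(x)])

# Two toy series: an irregular one and a regular one.
rng = np.random.default_rng(0)
noise = rng.normal(size=500)
tone = np.sin(np.linspace(0, 20 * np.pi, 500))
print(entropy_features(noise), entropy_features(tone))
```

Such a feature vector can then be passed to any statistical model or machine learning classifier, which is one of the two integration routes (model-level combination) listed below.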

In this Special Issue, we would like to collect papers focusing on the simultaneous application of more than one entropy measure for time series classification, not just for performance comparison purposes, but to exploit possible synergies and improve on the classification performance achieved by a single measure used in isolation. Any kind of entropy measure and any kind of time series can be considered.

The main topics of this Special Issue include (but are not limited to) theoretical developments and practical applications with regard to:

  • Time series classification using statistical models or equations including more than one entropy measure;
  • Multimeasure algorithm integration: new entropy methods based on the evolution of current methods by merging different approaches into a single algorithm.

Dr. David Cuesta-Frau
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • entropy
  • amplitude patterns
  • ordinal patterns
  • symbolic patterns
  • sorting patterns
  • classification
  • clustering
  • machine learning
  • pattern recognition

Published Papers (2 papers)


Research

17 pages, 399 KiB  
Article
Using the Information Provided by Forbidden Ordinal Patterns in Permutation Entropy to Reinforce Time Series Discrimination Capabilities
by David Cuesta-Frau
Entropy 2020, 22(5), 494; https://doi.org/10.3390/e22050494 - 25 Apr 2020
Cited by 7 | Viewed by 2116
Abstract
Despite its widely tested and proven usefulness, there is still room for improvement in the basic permutation entropy (PE) algorithm, as several subsequent studies have demonstrated in recent years. Some of these new methods try to address the well-known PE weaknesses, such as its focus only on ordinal and not on amplitude information, and the possible detrimental impact of equal values found in subsequences. Other new methods address less specific weaknesses, such as the PE results’ dependence on input parameter values, a common problem found in many entropy calculation methods. The lack of discriminating power among classes in some cases is also a generic problem when entropy measures are used for data series classification. This last problem is the one specifically addressed in the present study. Toward that purpose, the classification performance of the standard PE method was first assessed by conducting several time series classification tests over a varied and diverse set of data. Then, this performance was reassessed using a new Shannon Entropy normalisation scheme proposed in this paper: divide the relative frequencies in PE by the number of different ordinal patterns actually found in the time series, instead of by the theoretically expected number. According to the classification accuracy obtained, this last approach exhibited a higher class discriminating power. It was capable of finding significant differences in six out of seven experimental datasets—whereas the standard PE method only did in four—and it also had better classification accuracy. It can be concluded that using the additional information provided by the number of forbidden/found patterns, it is possible to achieve a higher discriminating power than using the classical PE normalisation method. The resulting algorithm is also very similar to that of PE and very easy to implement.
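One plausible reading of the normalisation scheme described in the abstract can be sketched as follows: the entropy is normalised by the logarithm of the number of ordinal patterns actually found, rather than by the logarithm of the theoretical maximum m!. This is an illustrative sketch based only on the abstract; the exact formulation in the paper may differ.

```python
import math
import numpy as np

def permutation_entropy_norm(x, m=3, use_found=True):
    """Normalised permutation entropy. With use_found=True, the Shannon
    entropy of the ordinal-pattern distribution is divided by log of the
    number of patterns actually found (so forbidden patterns matter);
    otherwise by log(m!), the classical normalisation."""
    counts = {}
    for i in range(len(x) - m + 1):
        key = tuple(np.argsort(x[i:i + m]))  # ordinal pattern of the window
        counts[key] = counts.get(key, 0) + 1
    p = np.array(list(counts.values()), dtype=float)
    p /= p.sum()
    h = -np.sum(p * np.log(p))
    denom = math.log(len(counts)) if use_found else math.log(math.factorial(m))
    return h / denom if denom > 0 else 0.0

# For a structured series with forbidden patterns, the found-based
# normalisation yields a value at least as large as the classical one.
rng = np.random.default_rng(1)
noise = rng.normal(size=1000)
print(permutation_entropy_norm(noise, use_found=True),
      permutation_entropy_norm(noise, use_found=False))
```

Since the number of found patterns never exceeds m!, the found-based value is always greater than or equal to the classical one, and it remains bounded in [0, 1].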

22 pages, 714 KiB  
Article
Slope Entropy: A New Time Series Complexity Estimator Based on Both Symbolic Patterns and Amplitude Information
by David Cuesta-Frau
Entropy 2019, 21(12), 1167; https://doi.org/10.3390/e21121167 - 28 Nov 2019
Cited by 45 | Viewed by 4525
Abstract
The development of new measures and algorithms to quantify the entropy or related concepts of a data series is a continuous effort that has brought many innovations in this regard in recent years. The ultimate goal is usually to find new methods with a higher discriminating power, more efficient, more robust to noise and artifacts, less dependent on parameters or configurations, or any other possibly desirable feature. Among all these methods, Permutation Entropy (PE) is a complexity estimator for a time series that stands out due to its many strengths, with very few weaknesses. One of these weaknesses is the PE’s disregarding of time series amplitude information. Some PE algorithm modifications have been proposed in order to introduce such information into the calculations. We propose in this paper a new method, Slope Entropy (SlopEn), that also addresses this flaw but in a different way, keeping the symbolic representation of subsequences using a novel encoding method based on the slope generated by two consecutive data samples. By means of a thorough and extensive set of comparative experiments with PE and Sample Entropy (SampEn), we demonstrate that SlopEn is a very promising method with clearly better time series classification performance than those previous methods.
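The slope-based encoding can be sketched as follows: each consecutive difference is mapped to one of five symbols according to two thresholds, and Shannon entropy is computed over the resulting symbolic subsequences. The threshold values and parameter names here are assumptions for illustration only; consult the paper for the reference definition.

```python
import math
from collections import Counter

def slope_entropy(x, m=4, gamma=1.0, delta=1e-3):
    """Slope Entropy sketch: map each difference x[i+1]-x[i] to a symbol in
    {-2,-1,0,1,2} using thresholds gamma (steep) and delta (near-flat),
    then compute Shannon entropy over length-(m-1) symbolic words."""
    def symbol(d):
        if d > gamma:
            return 2      # steep rise
        if d > delta:
            return 1      # gentle rise
        if d >= -delta:
            return 0      # approximately flat
        if d >= -gamma:
            return -1     # gentle fall
        return -2         # steep fall
    diffs = [x[i + 1] - x[i] for i in range(len(x) - 1)]
    words = Counter(tuple(symbol(d) for d in diffs[i:i + m - 1])
                    for i in range(len(diffs) - m + 2))
    n = sum(words.values())
    return -sum((c / n) * math.log(c / n) for c in words.values())
```

A constant series produces a single symbolic word and hence zero entropy, while any series with varied slopes yields a positive value, which matches the intuition of a complexity estimator.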
