Open Access Article
Entropy 2018, 20(12), 962; https://doi.org/10.3390/e20120962

Range Entropy: A Bridge between Signal Complexity and Self-Similarity

1 The Florey Institute of Neuroscience and Mental Health, Austin Campus, Heidelberg, VIC 3084, Australia
2 Faculty of Medicine, Dentistry and Health Sciences, The University of Melbourne, VIC 3010, Australia
3 Department of Electrical and Computer Engineering, Sultan Qaboos University, Muscat 123, Oman
4 Department of Neurology, Austin Health, Melbourne, VIC 3084, Australia
* Author to whom correspondence should be addressed.
Received: 1 November 2018 / Revised: 3 December 2018 / Accepted: 6 December 2018 / Published: 13 December 2018

Abstract

Approximate entropy (ApEn) and sample entropy (SampEn) are widely used for temporal complexity analysis of real-world phenomena. However, their relationship with the Hurst exponent as a measure of self-similarity is not widely studied. Additionally, ApEn and SampEn are susceptible to signal amplitude changes. A common practice for addressing this issue is to correct their input signal amplitude by its standard deviation. In this study, we first show, using simulations, that ApEn and SampEn are related to the Hurst exponent through their tolerance r and embedding dimension m parameters. We then propose a modification to ApEn and SampEn called range entropy, or RangeEn. We show that RangeEn is more robust to nonstationary signal changes and has a more linear relationship with the Hurst exponent, compared to ApEn and SampEn. RangeEn is bounded in the tolerance r-plane between 0 (maximum entropy) and 1 (minimum entropy) and needs no signal amplitude correction. Finally, we demonstrate the clinical usefulness of signal entropy measures for characterisation of epileptic EEG data as a real-world example.
Keywords: approximate entropy; sample entropy; range entropy; complexity; self-similarity; Hurst exponent
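The abstract outlines the method at a high level: entropy is estimated from the fraction of embedding vectors of length m that match within a tolerance r, and RangeEn replaces the usual distance with one bounded in [0, 1], so no amplitude correction is needed. The Python sketch below illustrates this idea; it is not the authors' reference implementation, and the specific range-based distance, (max - min)/(max + min) over element-wise absolute differences, is an assumption chosen to satisfy the stated 0-to-1 bound.

# Minimal sketch (not the authors' reference implementation): a sample-entropy-style
# estimator using either the usual Chebyshev distance (SampEn) or a range-based
# distance bounded in [0, 1] (a RangeEn-style variant, as described in the abstract).
import numpy as np

def _templates(x, m):
    # All overlapping embedding vectors of length m.
    return np.array([x[i:i + m] for i in range(len(x) - m + 1)])

def _match_fraction(x, m, r, use_range_distance):
    X = _templates(x, m)
    n = len(X)
    count, total = 0, 0
    for i in range(n - 1):
        diff = np.abs(X[i + 1:] - X[i])      # |x_{i+k} - x_{j+k}| for every template j > i
        if use_range_distance:
            hi, lo = diff.max(axis=1), diff.min(axis=1)
            # Assumed range-based distance (max - min) / (max + min), bounded in [0, 1];
            # guarded against zero denominators for identical templates.
            d = np.divide(hi - lo, hi + lo, out=np.zeros_like(hi), where=(hi + lo) > 0)
        else:
            d = diff.max(axis=1)             # Chebyshev distance (classic SampEn)
        count += np.sum(d <= r)
        total += len(d)
    return count / total if total else 0.0

def sample_entropy(x, m=2, r=0.2, use_range_distance=False):
    # SampEn-style estimate: -ln(A / B), where B and A are the fractions of template
    # pairs matching within tolerance r at embedding dimensions m and m + 1.
    x = np.asarray(x, dtype=float)
    B = _match_fraction(x, m, r, use_range_distance)
    A = _match_fraction(x, m + 1, r, use_range_distance)
    if A == 0 or B == 0:
        return np.inf
    return -np.log(A / B)

# Usage: for classic SampEn, r is usually scaled by the signal's standard deviation;
# the range-based variant takes r directly from (0, 1) with no amplitude correction.
rng = np.random.default_rng(0)
x = rng.standard_normal(1000)
print(sample_entropy(x, m=2, r=0.2 * x.std()))                  # SampEn
print(sample_entropy(x, m=2, r=0.5, use_range_distance=True))   # RangeEn-style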

This is an open access article distributed under the Creative Commons Attribution License which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited (CC BY 4.0).

Cite This Article

MDPI and ACS Style

Omidvarnia, A.; Mesbah, M.; Pedersen, M.; Jackson, G. Range Entropy: A Bridge between Signal Complexity and Self-Similarity. Entropy 2018, 20, 962.

Entropy EISSN 1099-4300, published by MDPI AG, Basel, Switzerland.