
Approximate Entropy and Its Application

A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Multidisciplinary Applications".

Deadline for manuscript submissions: 15 August 2024

Special Issue Editor


Prof. Dr. Daniel Keren
Guest Editor
Department of Computer Science, University of Haifa, Haifa 3498838, Israel
Interests: monitoring functions over streams; regularization; applications of algebraic geometry in computer science

Special Issue Information

Dear Colleagues,

Entropy is a fundamental property of data and a key metric in many scientific and engineering fields, such as signal processing, computer science, medicine, and physics. Entropy estimation has been studied extensively, but almost always under the assumption that the data are centralized and static, or reside in a single data stream seen in its entirety by the one node running the estimation algorithm. However, multiple distributed data sources are becoming increasingly common, and novel algorithms are required, for example, to quickly detect a distributed denial-of-service attack by approximating the global entropy over the nodes of a distributed server system without centralizing the data.
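To make the distributed setting concrete, below is a minimal sketch of the simplest such scheme, assuming each node may share a compact table of symbol counts (the node addresses and traffic symbols are hypothetical): a coordinator merges the per-node tables and computes the plug-in entropy of the global empirical distribution, so raw observations never cross the network. The algorithms solicited here would typically replace this exact merge with a communication-efficient approximation.

```python
import math
from collections import Counter

def merge_counts(node_counts):
    """Merge per-node symbol counts into one global count table.

    Only small count summaries cross the network, never raw records.
    """
    total = Counter()
    for counts in node_counts:
        total.update(counts)
    return total

def plug_in_entropy(counts):
    """Shannon entropy (bits) of the empirical distribution given by counts."""
    n = sum(counts.values())
    return -sum((c / n) * math.log2(c / n) for c in counts.values() if c > 0)

# Hypothetical example: three servers, each counting source IPs in its traffic.
# An abrupt shift in this entropy is one classic DDoS warning sign.
node_counts = [
    Counter({"10.0.0.1": 50, "10.0.0.2": 30}),
    Counter({"10.0.0.1": 45, "10.0.0.3": 25}),
    Counter({"10.0.0.2": 40, "10.0.0.4": 10}),
]
print(f"global entropy: {plug_in_entropy(merge_counts(node_counts)):.3f} bits")
```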

Contributions are solicited that address interesting theory and applications of estimating and approximating entropy in cases where the data are dynamic, distributed, noisy, partial, or any combination of the above. Also of interest are cases in which the data have been subject to some transformation, for example, linear transformations, projections, compression, or coding. Contributions on Approximate Entropy are also welcome.
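As one concrete, standard relationship for the transformed-data setting (a textbook identity, not a result from this issue): for a continuous random vector $X$ and an invertible linear map $A$, the differential entropy changes by a known constant,

$$ h(AX) = h(X) + \log \lvert \det A \rvert , $$

so estimates made on invertibly transformed data can be mapped back exactly, whereas non-invertible transformations such as projections or lossy compression discard information and make the estimation problem genuinely harder.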

Prof. Dr. Daniel Keren
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, proceed to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • approximation and estimation of entropy from data that may be any combination of the following: massive, partial, noisy, streaming, distributed

Published Papers (2 papers)


Research

34 pages, 1698 KiB  
Article
On the Accurate Estimation of Information-Theoretic Quantities from Multi-Dimensional Sample Data
by Manuel Álvarez Chaves, Hoshin V. Gupta, Uwe Ehret and Anneli Guthke
Entropy 2024, 26(5), 387; https://doi.org/10.3390/e26050387 - 30 Apr 2024
Abstract
Using information-theoretic quantities in practical applications with continuous data is often hindered by the fact that probability density functions need to be estimated in higher dimensions, which can become unreliable or even computationally unfeasible. To make these useful quantities more accessible, alternative approaches such as binned frequencies using histograms and k-nearest neighbors (k-NN) have been proposed. However, a systematic comparison of the applicability of these methods has been lacking. We wish to fill this gap by comparing kernel-density-based estimation (KDE) with these two alternatives in carefully designed synthetic test cases. Specifically, we wish to estimate the information-theoretic quantities: entropy, Kullback–Leibler divergence, and mutual information, from sample data. As a reference, the results are compared to closed-form solutions or numerical integrals. We generate samples from distributions of various shapes in dimensions ranging from one to ten. We evaluate the estimators’ performance as a function of sample size, distribution characteristics, and chosen hyperparameters. We further compare the required computation time and specific implementation challenges. Notably, k-NN estimation tends to outperform other methods, considering algorithmic implementation, computational efficiency, and estimation accuracy, especially with sufficient data. This study provides valuable insights into the strengths and limitations of the different estimation methods for information-theoretic quantities. It also highlights the significance of considering the characteristics of the data, as well as the targeted information-theoretic quantity when selecting an appropriate estimation technique. These findings will assist scientists and practitioners in choosing the most suitable method, considering their specific application and available data. We have collected the compared estimation methods in a ready-to-use open-source Python 3 toolbox and, thereby, hope to promote the use of information-theoretic quantities by researchers and practitioners to evaluate the information in data and models in various disciplines.
(This article belongs to the Special Issue Approximate Entropy and Its Application)
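To make the comparison concrete, here is a minimal sketch of two of the estimator families the paper studies: a binned (histogram) plug-in estimator and the Kozachenko–Leonenko k-NN estimator. This is an independent illustration, not the authors' Python 3 toolbox, and the choices of bins and k are arbitrary defaults.

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma, gammaln

def knn_entropy(samples, k=3):
    """Kozachenko-Leonenko k-NN estimator of differential entropy (nats).

    samples: (n, d) array of i.i.d. draws from an unknown continuous density
    (duplicate points would break the log of a zero distance).
    """
    x = np.asarray(samples, dtype=float)
    n, d = x.shape
    tree = cKDTree(x)
    # Query k+1 neighbors because each point is its own nearest neighbor.
    dist, _ = tree.query(x, k=k + 1)
    r = dist[:, -1]                      # distance to the k-th true neighbor
    # Log volume of the unit ball in d dimensions.
    log_c_d = (d / 2) * np.log(np.pi) - gammaln(d / 2 + 1)
    return digamma(n) - digamma(k) + log_c_d + d * np.mean(np.log(r))

def histogram_entropy(samples, bins=30):
    """Binned plug-in estimator of differential entropy (nats), 1-D case."""
    counts, edges = np.histogram(samples, bins=bins)
    p = counts / counts.sum()
    width = np.diff(edges)
    nz = p > 0
    # Discrete entropy of the bin frequencies, corrected by bin widths.
    return -np.sum(p[nz] * np.log(p[nz] / width[nz]))

rng = np.random.default_rng(0)
z = rng.normal(size=(5000, 1))
true_h = 0.5 * np.log(2 * np.pi * np.e)  # exact entropy of N(0, 1)
print(f"k-NN: {knn_entropy(z):.3f}  histogram: {histogram_entropy(z[:, 0]):.3f}  exact: {true_h:.3f}")
```

On 5,000 standard-normal samples, both estimates should land close to the exact value of about 1.419 nats, with the k-NN estimate typically closer, consistent with the abstract's observation that k-NN tends to perform well given sufficient data.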

20 pages, 3609 KiB  
Article
A Novel Fault Diagnosis Method for a Power Transformer Based on Multi-Scale Approximate Entropy and Optimized Convolutional Networks
by Haikun Shang, Zhidong Liu, Yanlei Wei and Shen Zhang
Entropy 2024, 26(3), 186; https://doi.org/10.3390/e26030186 - 22 Feb 2024
Abstract
Dissolved gas analysis (DGA) in transformer oil, which analyzes its gas content, is valuable for promptly detecting potential faults in oil-immersed transformers. Given the limitations of traditional transformer fault diagnostic methods, such as insufficient gas characteristic components and a high misjudgment rate for transformer faults, this study proposes a transformer fault diagnosis model based on multi-scale approximate entropy and optimized convolutional neural networks (CNNs). This study introduces an improved sparrow search algorithm (ISSA) for optimizing CNN parameters, establishing the ISSA-CNN transformer fault diagnosis model. The dissolved gas components in the transformer oil are analyzed, and the multi-scale approximate entropy of the gas content under different fault modes is calculated. The computed entropy values are then used as feature parameters for the ISSA-CNN model to derive diagnostic results. Experimental data analysis demonstrates that multi-scale approximate entropy effectively characterizes the dissolved gas components in the transformer oil, significantly improving the diagnostic efficiency. Comparative analysis with BPNN, ELM, and CNNs validates the effectiveness and superiority of the proposed ISSA-CNN diagnostic model across various evaluation metrics.
(This article belongs to the Special Issue Approximate Entropy and Its Application)
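For readers unfamiliar with the feature-extraction step, the sketch below shows multi-scale approximate entropy in its generic form: the series is coarse-grained by averaging non-overlapping windows at each scale, and Pincus's ApEn(m, r) is computed on each coarse-grained series. The tolerance r = 0.2 × the standard deviation and the random test series are illustrative assumptions; the paper's exact parameters and the ISSA-CNN classification stage are not reproduced here.

```python
import numpy as np

def approx_entropy(x, m=2, r=None):
    """Approximate entropy ApEn(m, r) of a 1-D series (Pincus, 1991)."""
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * x.std()              # common default tolerance

    def phi(m):
        n = len(x) - m + 1
        # All m-length template vectors, compared under Chebyshev distance.
        templates = np.array([x[i:i + m] for i in range(n)])
        d = np.max(np.abs(templates[:, None] - templates[None, :]), axis=2)
        c = (d <= r).sum(axis=1) / n   # self-matches included, as in ApEn
        return np.mean(np.log(c))

    return phi(m) - phi(m + 1)

def multiscale_apen(x, scales=(1, 2, 3, 4, 5), m=2):
    """Coarse-grain the series at each scale, then compute ApEn per scale."""
    x = np.asarray(x, dtype=float)
    features = []
    for tau in scales:
        n = len(x) // tau
        coarse = x[:n * tau].reshape(n, tau).mean(axis=1)
        features.append(approx_entropy(coarse, m=m))
    return features

# Hypothetical stand-in for one gas-concentration series; in the paper, the
# per-scale entropies become the feature vector fed to the classifier.
rng = np.random.default_rng(1)
series = rng.normal(size=600)
print(multiscale_apen(series))
```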
