Article

Entropy Estimation Using a Linguistic Zipf–Mandelbrot–Li Model for Natural Sequences

by Andrew D. Back * and Janet Wiles
School of Information Technology and Electrical Engineering, The University of Queensland, Brisbane, QLD 4072, Australia
* Author to whom correspondence should be addressed.
Academic Editors: Sergio Cruces, Iván Durán-Díaz, Rubén Martín-Clemente and Andrzej Cichocki
Entropy 2021, 23(9), 1100; https://doi.org/10.3390/e23091100
Received: 15 July 2021 / Revised: 14 August 2021 / Accepted: 19 August 2021 / Published: 24 August 2021
(This article belongs to the Special Issue Theory and Applications of Information Processing Algorithms)
Entropy estimation faces numerous challenges when applied to real-world problems. Our interest is in divergence and entropy estimation algorithms capable of rapid estimation for natural sequence data such as human and synthetic languages. Such estimation typically requires a large amount of data; here we propose a new approach based on a rank-based analytic Zipf–Mandelbrot–Li (ZML) probabilistic model. Unlike previous approaches, which do not consider the nature of the probability distribution in relation to language, we introduce an analytic Zipfian model that includes linguistic constraints, providing more accurate distributions for natural sequences such as natural or synthetic emergent languages. Results are given which indicate the performance of the proposed ZML model. We derive an entropy estimation method which incorporates the linguistic constraint-based ZML model into a new non-equiprobable coincidence counting algorithm, and this is shown to be effective for tasks such as entropy rate estimation with limited data.
Keywords: entropy estimation; Zipf–Mandelbrot–Li law; language models; probabilistic natural sequences
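The abstract outlines the model but not its parameterization. As a minimal sketch, the following assumes Li's random-text form of the Zipf–Mandelbrot distribution, p(k) ∝ (k + β)^(−α) with α = log(M + 1)/log(M) and β = M/(M + 1) for an M-symbol alphabet; the function names (zml_distribution, entropy_bits) are illustrative, and the naive plug-in estimate stands in for, rather than reproduces, the paper's non-equiprobable coincidence counting algorithm.

import numpy as np

def zml_distribution(n_ranks, m_alphabet):
    """ZML rank distribution: p(k) proportional to (k + beta)^(-alpha), k = 1..n_ranks."""
    alpha = np.log(m_alphabet + 1) / np.log(m_alphabet)  # Li's exponent (assumed form)
    beta = m_alphabet / (m_alphabet + 1)                 # Li's rank shift (assumed form)
    k = np.arange(1, n_ranks + 1)
    w = (k + beta) ** (-alpha)
    return w / w.sum()

def entropy_bits(p):
    """Shannon entropy in bits, ignoring zero-probability entries."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

rng = np.random.default_rng(0)
p = zml_distribution(10_000, 26)  # e.g., a 26-letter alphabet and 10,000 ranked words
print(f"ZML model entropy: {entropy_bits(p):.3f} bits")

# Naive plug-in estimate from a small sample of the same distribution.
sample = rng.choice(p.size, size=500, p=p)
freq = np.bincount(sample, minlength=p.size) / sample.size
print(f"Plug-in estimate from 500 samples: {entropy_bits(freq):.3f} bits")

With only 500 samples drawn over 10,000 ranks, the plug-in estimate typically falls well below the model entropy; this small-sample bias is the limited-data problem that the paper's coincidence-counting estimator is designed to address.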
MDPI and ACS Style

Back, A.D.; Wiles, J. Entropy Estimation Using a Linguistic Zipf–Mandelbrot–Li Model for Natural Sequences. Entropy 2021, 23, 1100. https://doi.org/10.3390/e23091100

AMA Style

Back AD, Wiles J. Entropy Estimation Using a Linguistic Zipf–Mandelbrot–Li Model for Natural Sequences. Entropy. 2021; 23(9):1100. https://doi.org/10.3390/e23091100

Chicago/Turabian Style

Back, Andrew D., and Janet Wiles. 2021. "Entropy Estimation Using a Linguistic Zipf–Mandelbrot–Li Model for Natural Sequences" Entropy 23, no. 9: 1100. https://doi.org/10.3390/e23091100
