# A Review of Shannon and Differential Entropy Rate Estimation


## Abstract


## 1. Introduction

## 2. Entropy and Entropy Rate

**Definition 1.**

**Definition 2.**

**Definition 3.**

**Definition 4.**

**Definition 5.**

**Definition 6.**

**Theorem 1.**

## 3. Parametric Approaches

### 3.1. Gaussian Processes

**Definition 7.**

**Σ** is the covariance matrix.
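For a multivariate Gaussian, the differential entropy depends on the data only through the covariance matrix **Σ**, via $h(X) = \frac{1}{2}\log\left((2\pi e)^n \det \mathbf{\Sigma}\right)$. The following sketch (function name and example matrix are ours, not from the paper) computes this quantity directly:

```python
import numpy as np

def gaussian_differential_entropy(cov, base=np.e):
    """Differential entropy of an n-dimensional Gaussian with covariance
    matrix `cov`:  h(X) = 0.5 * log((2*pi*e)^n * det(cov)).
    Returned in nats by default; pass base=2 for bits."""
    cov = np.atleast_2d(cov)
    n = cov.shape[0]
    # slogdet is numerically safer than log(det(cov)) for large n
    sign, logdet = np.linalg.slogdet(cov)
    if sign <= 0:
        raise ValueError("covariance matrix must be positive definite")
    h_nats = 0.5 * (n * np.log(2 * np.pi * np.e) + logdet)
    return h_nats / np.log(base)

# Illustrative 2-d Gaussian with unit variances and correlation 0.5:
Sigma = np.array([[1.0, 0.5], [0.5, 1.0]])
h = gaussian_differential_entropy(Sigma)
```

Parametric entropy rate estimators for Gaussian processes reduce to estimating **Σ** (or, equivalently, the spectral density) and plugging it into this expression.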

#### 3.1.1. Maximum Entropy Spectral Estimation

#### 3.1.2. Maximum Likelihood Spectral Estimation

#### 3.1.3. Non-Parametric Spectral Density Estimation

### 3.2. Markov Processes

#### 3.2.1. Markov Chains

#### 3.2.2. Hidden Markov Models

#### 3.2.3. Other Markov Processes

### 3.3. Renewal/Point Processes

## 4. Non-Parametric Approaches

### 4.1. Discrete-Valued, Discrete-Time Entropy Rate Estimation

### 4.2. Continuous-Valued, Discrete-Time Entropy Rate Estimation

#### 4.2.1. Approximate Entropy

#### 4.2.2. Sample Entropy

#### 4.2.3. Permutation Entropy

#### 4.2.4. Specific Entropy

## 5. Conclusions

## Author Contributions

## Funding

## Institutional Review Board Statement

## Informed Consent Statement

## Conflicts of Interest

## References

- Shannon, C.E. A mathematical theory of communication. Bell Syst. Tech. J. **1948**, 27, 379–423.
- Verdú, S. Empirical Estimation of Information Measures: A Literature Guide. Entropy **2019**, 21, 720.
- Amigó, J.M.; Balogh, S.G.; Hernández, S. A brief review of generalized entropies. Entropy **2018**, 20, 813.
- Contreras Rodríguez, L.; Madarro-Capó, E.J.; Legón-Pérez, C.M.; Rojas, O.; Sosa-Gómez, G. Selecting an Effective Entropy Estimator for Short Sequences of Bits and Bytes with Maximum Entropy. Entropy **2021**, 23, 561.
- Al-Babtain, A.A.; Elbatal, I.; Chesneau, C.; Elgarhy, M. Estimation of different types of entropies for the Kumaraswamy distribution. PLoS ONE **2021**, 16, e0249027.
- Cox, D.R. Principles of Statistical Inference; Cambridge University Press: Cambridge, UK, 2006.
- Burg, J. Maximum Entropy Spectral Analysis, Paper Presented at the 37th Meeting; Society of Exploration Geophysics: Oklahoma City, OK, USA, 1967.
- Burg, J.P. Maximum Entropy Spectral Analysis; Stanford University: Stanford, CA, USA, 1975.
- Capon, J. Maximum-likelihood spectral estimation. In Nonlinear Methods of Spectral Analysis; Springer: Berlin/Heidelberg, Germany, 1983; pp. 155–179.
- Basharin, G.P. On a statistical estimate for the entropy of a sequence of independent random variables. Theory Probab. Appl. **1959**, 4, 333–336.
- Ciuperca, G.; Girardin, V. On the estimation of the entropy rate of finite Markov chains. In Proceedings of the International Symposium on Applied Stochastic Models and Data Analysis, ENST Bretagne, Brest, France, 17–20 May 2005; pp. 1109–1117.
- Ciuperca, G.; Girardin, V. Estimation of the entropy rate of a countable Markov chain. Commun. Stat. Theory Methods **2007**, 36, 2543–2557.
- Ciuperca, G.; Girardin, V.; Lhote, L. Computation and estimation of generalized entropy rates for denumerable Markov chains. IEEE Trans. Inf. Theory **2011**, 57, 4026–4034.
- Kamath, S.; Verdú, S. Estimation of entropy rate and Rényi entropy rate for Markov chains. In Proceedings of the 2016 IEEE International Symposium on Information Theory (ISIT), Barcelona, Spain, 10–15 July 2016; pp. 685–689.
- Han, Y.; Jiao, J.; Lee, C.Z.; Weissman, T.; Wu, Y.; Yu, T. Entropy rate estimation for Markov chains with large state space. arXiv **2018**, arXiv:1802.07889.
- Chang, H.S. On convergence rate of the Shannon entropy rate of ergodic Markov chains via sample-path simulation. Stat. Probab. Lett. **2006**, 76, 1261–1264.
- Yari, G.H.; Nikooravesh, Z. Estimation of the Entropy Rate of Ergodic Markov Chains. J. Iran. Stat. Soc. **2012**, 11, 75–85.
- Strelioff, C.C.; Crutchfield, J.P.; Hübler, A.W. Inferring Markov chains: Bayesian estimation, model comparison, entropy rate, and out-of-class modeling. Phys. Rev. E **2007**, 76, 011106.
- Nair, C.; Ordentlich, E.; Weissman, T. Asymptotic filtering and entropy rate of a hidden Markov process in the rare transitions regime. In Proceedings of the International Symposium on Information Theory, Adelaide, SA, Australia, 4–9 September 2005; pp. 1838–1842.
- Ordentlich, E.; Weissman, T. Approximations for the entropy rate of a hidden Markov process. In Proceedings of the International Symposium on Information Theory, Adelaide, SA, Australia, 4–9 September 2005; pp. 2198–2202.
- Ordentlich, O. Novel lower bounds on the entropy rate of binary hidden Markov processes. In Proceedings of the 2016 IEEE International Symposium on Information Theory (ISIT), Barcelona, Spain, 10–15 July 2016; pp. 690–694.
- Luo, J.; Guo, D. On the entropy rate of hidden Markov processes observed through arbitrary memoryless channels. IEEE Trans. Inf. Theory **2009**, 55, 1460–1467.
- Gao, Y.; Kontoyiannis, I.; Bienenstock, E. Estimating the Entropy of Binary Time Series: Methodology, Some Theory and a Simulation Study. Entropy **2008**, 10, 71–99.
- Travers, N.F. Exponential Bounds for Convergence of Entropy Rate Approximations in Hidden Markov Models Satisfying a Path-Mergeability Condition. arXiv **2014**, arXiv:math.PR/1211.6181.
- Peres, Y.; Quas, A. Entropy rate for hidden Markov chains with rare transitions. In Entropy of Hidden Markov Processes and Connections to Dynamical Systems: Papers from the Banff International Research Station Workshop; London Mathematical Society Lecture Note Series; Marcus, B., Petersen, K., Weissman, T., Eds.; Cambridge University Press: Cambridge, UK, 2011; pp. 172–178.
- Regnault, P. Plug-in Estimator of the Entropy Rate of a Pure-Jump Two-State Markov Process. In AIP Conference Proceedings; American Institute of Physics: College Park, MD, USA, 2009; Volume 1193, pp. 153–160.
- Bartlett, M.S. On the theoretical specification and sampling properties of autocorrelated time-series. Suppl. J. R. Stat. Soc. **1946**, 8, 27–41.
- Bartlett, M.S. Periodogram analysis and continuous spectra. Biometrika **1950**, 37, 1–16.
- Tukey, J. The sampling theory of power spectrum estimates. In Symposium on Applications of Autocorrelation Analysis to Physical Problems; US Office of Naval Research: Arlington, VA, USA, 1950; pp. 47–67.
- Grenander, U. On empirical spectral analysis of stochastic processes. Ark. Mat. **1952**, 1, 503–531.
- Parzen, E. On choosing an estimate of the spectral density function of a stationary time series. Ann. Math. Stat. **1957**, 28, 921–932.
- Parzen, E. On consistent estimates of the spectrum of a stationary time series. Ann. Math. Stat. **1957**, 28, 329–348.
- Stoica, P.; Sundin, T. On nonparametric spectral estimation. Circuits Syst. Signal Process. **1999**, 18, 169–181.
- Kim, Y.M.; Lahiri, S.N.; Nordman, D.J. Non-Parametric Spectral Density Estimation Under Long-Range Dependence. J. Time Ser. Anal. **2018**, 39, 380–401.
- Lenk, P.J. Towards a practicable Bayesian nonparametric density estimator. Biometrika **1991**, 78, 531–543.
- Carter, C.K.; Kohn, R. Semiparametric Bayesian Inference for Time Series with Mixed Spectra. J. R. Stat. Soc. Ser. B **1997**, 59, 255–268.
- Gangopadhyay, A.; Mallick, B.; Denison, D. Estimation of spectral density of a stationary time series via an asymptotic representation of the periodogram. J. Stat. Plan. Inference **1999**, 75, 281–290.
- Liseo, B.; Marinucci, D.; Petrella, L. Bayesian semiparametric inference on long-range dependence. Biometrika **2001**, 88, 1089–1104.
- Choudhuri, N.; Ghosal, S.; Roy, A. Bayesian estimation of the spectral density of a time series. J. Am. Stat. Assoc. **2004**, 99, 1050–1059.
- Edwards, M.C.; Meyer, R.; Christensen, N. Bayesian nonparametric spectral density estimation using B-spline priors. Stat. Comput. **2019**, 29, 67–78.
- Tobar, F.; Bui, T.D.; Turner, R.E. Design of covariance functions using inter-domain inducing variables. In Proceedings of the NIPS 2015 Time Series Workshop, Montreal, QC, Canada, 11–15 December 2015.
- Tobar, F.; Bui, T.; Turner, R. Learning Stationary Time Series Using Gaussian Processes with Nonparametric Kernels. In Proceedings of the NIPS 2015 Time Series Workshop, Montreal, QC, Canada, 11–15 December 2015.
- Tobar, F. Bayesian nonparametric spectral estimation. arXiv **2018**, arXiv:1809.02196.
- Grassberger, P. Estimating the information content of symbol sequences and efficient codes. IEEE Trans. Inf. Theory **1989**, 35, 669–675.
- Kontoyiannis, I.; Algoet, P.H.; Suhov, Y.M.; Wyner, A.J. Nonparametric entropy estimation for stationary processes and random fields, with applications to English text. IEEE Trans. Inf. Theory **1998**, 44, 1319–1327.
- Quas, A.N. An entropy estimator for a class of infinite alphabet processes. Theory Probab. Appl. **1999**, 43, 496–507.
- Kaltchenko, A.; Yang, E.H.; Timofeeva, N. Entropy estimators with almost sure convergence and an O(n⁻¹) variance. In Proceedings of the 2007 IEEE Information Theory Workshop, Tahoe City, CA, USA, 2–6 September 2007; pp. 644–649.
- Kaltchenko, A.; Timofeeva, N. Rate of convergence of the nearest neighbor entropy estimator. AEU-Int. J. Electron. Commun. **2010**, 64, 75–79.
- Vatutin, V.; Mikhailov, V. Statistical estimation of the entropy of discrete random variables with a large number of outcomes. Russ. Math. Surv. **1995**, 50, 963.
- Timofeev, E. Statistical estimation of measure invariants. St. Petersburg Math. J. **2006**, 17, 527–551.
- Pincus, S.M. Approximate entropy as a measure of system complexity. Proc. Natl. Acad. Sci. USA **1991**, 88, 2297–2301.
- Richman, J.S.; Moorman, J.R. Physiological time-series analysis using approximate entropy and sample entropy. Am. J. Physiol. Heart Circ. Physiol. **2000**, 278, H2039–H2049.
- Bandt, C.; Pompe, B. Permutation entropy: A natural complexity measure for time series. Phys. Rev. Lett. **2002**, 88, 174102.
- Darmon, D. Specific differential entropy rate estimation for continuous-valued time series. Entropy **2016**, 18, 190.
- Girardin, V.; Sesboüé, A. Asymptotic study of an estimator of the entropy rate of a two-state Markov chain for one long trajectory. In AIP Conference Proceedings; American Institute of Physics: College Park, MD, USA, 2006; Volume 872, pp. 403–410.
- Girardin, V.; Sesboüé, A. Comparative construction of plug-in estimators of the entropy rate of two-state Markov chains. Methodol. Comput. Appl. Probab. **2009**, 11, 181–200.
- Rényi, A. On measures of entropy and information. In Proceedings of the Fourth Berkeley Symposium on Mathematical Statistics and Probability, Volume 1: Contributions to the Theory of Statistics; University of California Press: Berkeley, CA, USA, 1961; pp. 547–561.
- Komaee, A. Mutual information rate between stationary Gaussian processes. Results Appl. Math. **2020**, 7, 100107.
- Rice, J.A. Mathematical Statistics and Data Analysis, 3rd ed.; Duxbury Press: Belmont, CA, USA, 2006.
- Cramér, H. Mathematical Methods of Statistics; Princeton University Press: Princeton, NJ, USA, 1999; Volume 43.
- Box, G.E.; Jenkins, G.M.; Reinsel, G.C.; Ljung, G.M. Time Series Analysis: Forecasting and Control; John Wiley & Sons: Hoboken, NJ, USA, 2015.
- Cover, T.M.; Thomas, J.A. Elements of Information Theory (Wiley Series in Telecommunications and Signal Processing); Wiley-Interscience: Hoboken, NJ, USA, 2006.
- Jaynes, E.T. Information theory and statistical mechanics. Phys. Rev. **1957**, 106, 620.
- Choi, B.; Cover, T.M. An information-theoretic proof of Burg’s maximum entropy spectrum. Proc. IEEE **1984**, 72, 1094–1096.
- Franke, J. ARMA processes have maximal entropy among time series with prescribed autocovariances and impulse responses. Adv. Appl. Probab. **1985**, 17, 810–840.
- Franke, J. A Levinson-Durbin recursion for autoregressive-moving average processes. Biometrika **1985**, 72, 573–581.
- Feutrill, A.; Roughan, M. Differential Entropy Rate Characterisations of Long Range Dependent Processes. arXiv **2021**, arXiv:2102.05306.
- Landau, H.J. Maximum entropy and maximum likelihood in spectral estimation. IEEE Trans. Inf. Theory **1998**, 44, 1332–1336.
- Rezaeian, M. Hidden Markov process: A new representation, entropy rate and estimation entropy. arXiv **2006**, arXiv:0606114.
- Jacquet, P.; Seroussi, G.; Szpankowski, W. On the entropy of a hidden Markov process. SAIL—String Algorithms, Information and Learning: Dedicated to Professor Alberto Apostolico on the occasion of his 60th birthday. Theor. Comput. Sci. **2008**, 395, 203–219.
- Egner, S.; Balakirsky, V.; Tolhuizen, L.; Baggen, S.; Hollmann, H. On the entropy rate of a hidden Markov model. In Proceedings of the International Symposium on Information Theory, Chicago, IL, USA, 27 June–2 July 2004.
- Ephraim, Y.; Merhav, N. Hidden Markov Processes. IEEE Trans. Inf. Theory **2002**, 48, 1518–1569.
- Han, G.; Marcus, B. Analyticity of Entropy Rate of Hidden Markov Chains. IEEE Trans. Inf. Theory **2006**, 52, 5251–5266.
- Zuk, O.; Kanter, I.; Domany, E. The entropy of a binary hidden Markov process. J. Stat. Phys. **2005**, 121, 343–360.
- Zuk, O.; Domany, E.; Kanter, I.; Aizenman, M. Taylor series expansions for the entropy rate of Hidden Markov Processes. In Proceedings of the 2006 IEEE International Conference on Communications, Istanbul, Turkey, 11–15 June 2006; Volume 4, pp. 1598–1604.
- Yari, G.; Nikooravesh, Z. Taylor Expansion for the Entropy Rate of Hidden Markov Chains. J. Stat. Res. Iran **2011**, 7, 103–120.
- Dumitrescu, M.E. Some informational properties of Markov pure-jump processes. Časopis Pěstování Mat. **1988**, 113, 429–434.
- Gibbons, J.D.; Chakraborti, S. Nonparametric Statistical Inference: Revised and Expanded; CRC Press: Boca Raton, FL, USA, 2014.
- Beirlant, J.; Dudewicz, E.J.; Györfi, L.; Van der Meulen, E.C. Nonparametric entropy estimation: An overview. Int. J. Math. Stat. Sci. **1997**, 6, 17–39.
- Bouzebda, S.; Elhattab, I. Uniform-in-bandwidth consistency for kernel-type estimators of Shannon’s entropy. Electron. J. Stat. **2011**, 5, 440–459.
- Ziv, J.; Lempel, A. A universal algorithm for sequential data compression. IEEE Trans. Inf. Theory **1977**, 23, 337–343.
- Wyner, A.D.; Ziv, J. Some asymptotic properties of the entropy of a stationary ergodic data source with applications to data compression. IEEE Trans. Inf. Theory **1989**, 35, 1250–1258.
- Ornstein, D.S.; Weiss, B. Entropy and data compression schemes. IEEE Trans. Inf. Theory **1993**, 39, 78–83.
- Shields, P.C. Entropy and Prefixes. Ann. Probab. **1992**, 20, 403–409.
- Kontoyiannis, I.; Soukhov, I. Prefixes and the entropy rate for long-range sources. In Proceedings of the 1994 IEEE International Symposium on Information Theory, Trondheim, Norway, 27 June–1 July 1994.
- Dobrushin, R. A simplified method of experimental estimation of the entropy of a stationary distribution. Engl. Trans. Theory Probab. Appl. **1958**, 3, 462–464.
- Bandt, C.; Shiha, F. Order patterns in time series. J. Time Ser. Anal. **2007**, 28, 646–665.
- Alcaraz, R.; Rieta, J.J. A review on sample entropy applications for the non-invasive analysis of atrial fibrillation electrocardiograms. Biomed. Signal Process. Control **2010**, 5, 1–14.
- Chen, X.; Solomon, I.C.; Chon, K.H. Comparison of the use of approximate entropy and sample entropy: Applications to neural respiratory signal. In Proceedings of the 2005 IEEE Engineering in Medicine and Biology 27th Annual Conference, Shanghai, China, 17–18 January 2006; pp. 4212–4215.
- Lake, D.E.; Richman, J.S.; Griffin, M.P.; Moorman, J.R. Sample entropy analysis of neonatal heart rate variability. Am. J. Physiol. Regul. Integr. Comp. Physiol. **2002**, 283, R789–R797.
- Eckmann, J.P.; Ruelle, D. Ergodic theory of chaos and strange attractors. In The Theory of Chaotic Attractors; Springer: Berlin/Heidelberg, Germany, 1985; pp. 273–312.
- Delgado-Bonal, A.; Marshak, A. Approximate Entropy and Sample Entropy: A Comprehensive Tutorial. Entropy **2019**, 21, 541.
- Yentes, J.M.; Hunt, N.; Schmid, K.K.; Kaipust, J.P.; McGrath, D.; Stergiou, N. The appropriate use of approximate entropy and sample entropy with short data sets. Ann. Biomed. Eng. **2013**, 41, 349–365.
- Udhayakumar, R.K.; Karmakar, C.; Palaniswami, M. Approximate entropy profile: A novel approach to comprehend irregularity of short-term HRV signal. Nonlinear Dyn. **2017**, 88, 823–837.
- Durrett, R. Probability: Theory and Examples, 4th ed.; Cambridge University Press: New York, NY, USA, 2010.
- Darmon, D. Information-theoretic model selection for optimal prediction of stochastic dynamical systems from data. Phys. Rev. E **2018**, 97, 032206.
- Contreras-Reyes, J.E. Fisher information and uncertainty principle for skew-gaussian random variables. Fluct. Noise Lett. **2021**, 20, 2150039.
- Golshani, L.; Pasha, E. Rényi entropy rate for Gaussian processes. Inf. Sci. **2010**, 180, 1486–1491.

**Table 1.** Comparison of entropy rate estimation techniques, categorised by whether the modelling estimate and the entropy rate estimate are parametric or non-parametric. The modelling estimate refers to the quantity estimated by the technique, and the entropy rate estimate refers to the full entropy rate expression used. For example, when estimating the entropy rate of a Markov chain by plug-in estimation, the modelling estimates may be non-parametric for the transition probabilities ${p}_{ij}$ and the stationary distribution ${\pi}_{j}$, but the entropy rate estimator is a parametric estimator for the Markov model. Hence, there are no non-parametric/parametric estimators, because non-parametric entropy estimators do not use a model.

| Entropy Rate Estimate | Modelling Estimate: Parametric | Modelling Estimate: Non-Parametric |
|---|---|---|
| Parametric | [7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23,24,25,26] | [27,28,29,30,31,32,33,34,35,36,37,38,39,40,41,42,43] |
| Non-Parametric | N/A | [44,45,46,47,48,49,50,51,52,53,54] |
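The plug-in scheme described in the caption of Table 1 can be sketched in a few lines. This is an illustrative implementation under our own naming, not code from the paper: estimate the transition probabilities ${p}_{ij}$ and stationary distribution ${\pi}_{j}$ from one observed trajectory, then plug them into the Markov chain entropy rate formula $H = -\sum_i \pi_i \sum_j p_{ij} \log p_{ij}$.

```python
import numpy as np

def markov_entropy_rate_plugin(seq, n_states):
    """Plug-in estimate (in bits) of the Shannon entropy rate of a
    first-order Markov chain from a single observed trajectory `seq`
    of integer states 0..n_states-1. Assumes every state is visited
    at least once, so each row of the count matrix is non-empty."""
    # Empirical transition counts and row-normalised transition matrix
    counts = np.zeros((n_states, n_states))
    for a, b in zip(seq[:-1], seq[1:]):
        counts[a, b] += 1
    P = counts / counts.sum(axis=1, keepdims=True)
    # Stationary distribution: left eigenvector of P for eigenvalue 1
    evals, evecs = np.linalg.eig(P.T)
    pi = np.real(evecs[:, np.argmax(np.real(evals))])
    pi = pi / pi.sum()
    # H = -sum_i pi_i sum_j p_ij log2 p_ij, with 0 log 0 := 0
    logP = np.where(P > 0, np.log2(P, where=P > 0), 0.0)
    return float(-np.sum(pi[:, None] * P * logP))
```

For instance, a binary sequence whose empirical transitions are uniform gives the maximal rate of one bit per symbol, while a deterministically alternating sequence gives a rate of zero.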

**Table 2.** Comparison of entropy rate estimation techniques, partitioned into four categories based on whether they operate in discrete or continuous time, and whether they work on discrete- or continuous-valued data.

Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).

## Share and Cite

**MDPI and ACS Style**

Feutrill, A.; Roughan, M.
A Review of Shannon and Differential Entropy Rate Estimation. *Entropy* **2021**, *23*, 1046.
https://doi.org/10.3390/e23081046

**AMA Style**

Feutrill A, Roughan M.
A Review of Shannon and Differential Entropy Rate Estimation. *Entropy*. 2021; 23(8):1046.
https://doi.org/10.3390/e23081046

**Chicago/Turabian Style**

Feutrill, Andrew, and Matthew Roughan.
2021. "A Review of Shannon and Differential Entropy Rate Estimation" *Entropy* 23, no. 8: 1046.
https://doi.org/10.3390/e23081046