Open Access Editorial

New Developments in Statistical Information Theory Based on Entropy and Divergence Measures

Department of Statistics and Operations Research, Faculty of Mathematics, Universidad Complutense de Madrid, 28040 Madrid, Spain
Entropy 2019, 21(4), 391; https://doi.org/10.3390/e21040391
Received: 28 March 2019 / Accepted: 9 April 2019 / Published: 11 April 2019
Note: In lieu of an abstract, this is an excerpt from the first page.

In recent decades, interest in statistical methods based on information measures, and particularly in pseudodistances or divergences, has grown substantially [...]
