Open Access Editorial

New Developments in Statistical Information Theory Based on Entropy and Divergence Measures

L. Pardo
Department of Statistics and Operations Research, Faculty of Mathematics, Universidad Complutense de Madrid, 28040 Madrid, Spain

Entropy 2019, 21(4), 391; https://doi.org/10.3390/e21040391
Received: 28 March 2019 / Accepted: 9 April 2019 / Published: 11 April 2019
Note: In lieu of an abstract, this is an excerpt from the first page.

Excerpt

In recent decades, interest in statistical methods based on information measures, and particularly in pseudodistances or divergences, has grown substantially [...]
This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Share & Cite This Article

MDPI and ACS Style

Pardo, L. New Developments in Statistical Information Theory Based on Entropy and Divergence Measures. Entropy 2019, 21, 391.


Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.
