New Developments in Statistical Information Theory Based on Entropy and Divergence Measures
Department of Statistics and Operations Research, Faculty of Mathematics, Universidad Complutense de Madrid, 28040 Madrid, Spain
Entropy 2019, 21(4), 391; https://doi.org/10.3390/e21040391
Received: 28 March 2019 / Accepted: 9 April 2019 / Published: 11 April 2019
(This article belongs to the Special Issue New Developments in Statistical Information Theory Based on Entropy and Divergence Measures)
Note: In lieu of an abstract, this is an excerpt from the first page.
In the last decades, the interest in statistical methods based on information measures, and particularly in pseudodistances or divergences, has grown substantially [...]
This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
MDPI and ACS Style
Pardo, L. New Developments in Statistical Information Theory Based on Entropy and Divergence Measures. Entropy 2019, 21, 391. https://doi.org/10.3390/e21040391
AMA Style
Pardo L. New Developments in Statistical Information Theory Based on Entropy and Divergence Measures. Entropy. 2019; 21(4):391. https://doi.org/10.3390/e21040391
Chicago/Turabian Style
Pardo, Leandro. 2019. "New Developments in Statistical Information Theory Based on Entropy and Divergence Measures" Entropy 21, no. 4: 391. https://doi.org/10.3390/e21040391
Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.