Entropy 2012, 14(3), 480-490; doi:10.3390/e14030480
Article

Interval Entropy and Informative Distance

Fakhroddin Misagh 1 and Gholamhossein Yari 2

1 Department of Statistics, Science and Research Branch, Islamic Azad University, Tehran 14778-93855, Iran
2 School of Mathematics, Iran University of Science and Technology, Tehran 16846-13114, Iran
* Author to whom correspondence should be addressed.
Received: 20 December 2011; Revised: 4 February 2012; Accepted: 7 February 2012; Published: 2 March 2012
(This article belongs to the Special Issue Concepts of Entropy and Their Applications)
Abstract: The Shannon interval entropy function has been proposed in the reliability literature as a useful dynamic measure of uncertainty for doubly truncated random variables. In this paper, we show that the interval entropy uniquely determines the distribution function. Furthermore, we propose a measure of discrepancy between two lifetime distributions over an interval of time, based on the Kullback-Leibler discrimination information. We study various properties of this measure, including its connection with the residual and past measures of discrepancy and with interval entropy, and we obtain upper and lower bounds for it.
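For orientation, the measures discussed in the abstract are, in the standard notation of the interval-entropy literature, the Shannon entropy and the Kullback-Leibler divergence of a lifetime conditioned on falling in an interval (t_1, t_2). The following is a sketch of those standard definitions, not a quotation of the paper's own equations; the symbols IH and IK, with f, g the densities and F, G the distribution functions of lifetimes X and Y, are notational assumptions here.

IH_X(t_1, t_2) = -\int_{t_1}^{t_2} \frac{f(x)}{F(t_2) - F(t_1)} \,\log \frac{f(x)}{F(t_2) - F(t_1)} \, dx

IK_{X,Y}(t_1, t_2) = \int_{t_1}^{t_2} \frac{f(x)}{F(t_2) - F(t_1)} \,\log\!\left( \frac{f(x)/[F(t_2) - F(t_1)]}{g(x)/[G(t_2) - G(t_1)]} \right) dx

That is, IH is the entropy of X given t_1 < X \le t_2, and IK is the discrimination information between X and Y restricted to the same interval; taking t_1 \to -\infty or t_2 \to \infty recovers the past and residual versions of these measures, respectively.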
Keywords: uncertainty; discrepancy; characterization

Cite This Article

MDPI and ACS Style

Misagh, F.; Yari, G. Interval Entropy and Informative Distance. Entropy 2012, 14, 480-490.

AMA Style

Misagh F, Yari G. Interval Entropy and Informative Distance. Entropy. 2012; 14(3):480-490.

Chicago/Turabian Style

Misagh, Fakhroddin, and Gholamhossein Yari. 2012. "Interval Entropy and Informative Distance." Entropy 14, no. 3: 480-490.

Entropy EISSN 1099-4300, Published by MDPI AG, Basel, Switzerland