Open Access: This article is freely available.
Interval Entropy and Informative Distance
Department of Statistics, Science and Research Branch, Islamic Azad University, Tehran, 14778-93855, Iran
School of Mathematics, Iran University of Science and Technology, Tehran, 16846-13114, Iran
* Author to whom correspondence should be addressed.
Received: 20 December 2011; in revised form: 4 February 2012 / Accepted: 7 February 2012 / Published: 2 March 2012
Abstract: The Shannon interval entropy function has been proposed in the reliability literature as a useful dynamic measure of uncertainty for two-sided (doubly) truncated random variables. In this paper, we show that the interval entropy uniquely determines the distribution function. Furthermore, we propose a measure of discrepancy between two lifetime distributions over an interval of time, based on the Kullback-Leibler discrimination information. We study various properties of this measure, including its connections with the residual and past measures of discrepancy and with the interval entropy, and we obtain its upper and lower bounds.
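For orientation, the quantities discussed in the abstract are typically defined as follows; this is a sketch in the standard notation of the doubly truncated entropy literature (f and F denote the density and distribution function of a lifetime X, g and G those of Y, and (t_1, t_2) the truncation interval), which may differ slightly from the notation used in the body of the paper.

IH_X(t_1, t_2) = -\int_{t_1}^{t_2} \frac{f(x)}{F(t_2)-F(t_1)} \log \frac{f(x)}{F(t_2)-F(t_1)} \, dx

IK_{X,Y}(t_1, t_2) = \int_{t_1}^{t_2} \frac{f(x)}{F(t_2)-F(t_1)} \log \frac{f(x)\,\bigl[G(t_2)-G(t_1)\bigr]}{g(x)\,\bigl[F(t_2)-F(t_1)\bigr]} \, dx

As t_1 tends to the lower endpoint of the support and t_2 tends to infinity, these reduce to the ordinary Shannon entropy and Kullback-Leibler divergence, while fixing only one endpoint recovers the residual and past versions referred to in the abstract.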
Keywords: uncertainty; discrepancy; characterization
Cite This Article
MDPI and ACS Style
Misagh, F.; Yari, G. Interval Entropy and Informative Distance. Entropy 2012, 14, 480-490.