Entropy Estimation of Generalized Half-Logistic Distribution (GHLD) Based on Type-II Censored Samples

This paper derives the entropy of the generalized half-logistic distribution based on Type-II censored samples, obtains entropy estimators by using Bayes estimators of the unknown shape parameter of the generalized half-logistic distribution based on Type-II censored samples, and compares these estimators in terms of mean squared error and bias through Monte Carlo simulations.


Introduction
Shannon [1], who proposed information theory to quantify information loss, introduced statistical entropy. Several researchers have discussed inference on entropy. Baratpour et al. [2] developed the entropy of upper record values and provided several upper and lower bounds for this entropy by using the hazard rate function. Abo-Eleneen [3] suggested an efficient method for the simple computation of entropy in progressively Type-II censored samples. Kang et al. [4] derived estimators for the entropy function of a double exponential distribution based on multiply Type-II censored samples by using maximum likelihood estimators (MLEs) and approximate MLEs (AMLEs).
Arora et al. [5] obtained the MLE of the shape parameter of a generalized half-logistic distribution (GHLD) based on Type-I progressive censoring with varying failure rates. Kim et al. [6] proposed Bayes estimators of the shape parameter and the reliability function of the GHLD based on progressively Type-II censored data under various loss functions. Seo et al. [7] developed an entropy estimation method for upper record values from the GHLD. Azimi [8] derived Bayes estimators of the shape parameter and the reliability function of the GHLD based on Type-II doubly censored samples.
The cumulative distribution function (cdf) and probability density function (pdf) of a random variable X with the GHLD are, respectively:

F(x) = 1 - \left( \frac{2e^{-x}}{1+e^{-x}} \right)^{\lambda}, \quad x > 0, \; \lambda > 0,    (1)

and:

f(x) = \frac{\lambda}{1+e^{-x}} \left( \frac{2e^{-x}}{1+e^{-x}} \right)^{\lambda},    (2)

where λ is the shape parameter. As a special case, if λ = 1, this reduces to the half-logistic distribution. The rest of this paper is organized as follows: Section 2 develops an entropy estimation method with Bayes estimators of the shape parameter in the GHLD based on Type-II censored samples. Section 3 assesses the validity of the proposed method by using Monte Carlo simulations, and Section 4 concludes.
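For concreteness, the GHLD cdf F(x) = 1 − (2e^{−x}/(1+e^{−x}))^λ, its pdf, and the corresponding quantile function can be sketched in Python. The function names below are ours, not from the paper; the quantile function simply inverts the cdf in closed form.

```python
import math

def ghld_cdf(x, lam):
    """CDF of the GHLD: F(x) = 1 - (2e^{-x}/(1+e^{-x}))^lam, x > 0."""
    return 1.0 - (2.0 * math.exp(-x) / (1.0 + math.exp(-x))) ** lam

def ghld_pdf(x, lam):
    """PDF of the GHLD: f(x) = lam/(1+e^{-x}) * (2e^{-x}/(1+e^{-x}))^lam."""
    g = 2.0 * math.exp(-x) / (1.0 + math.exp(-x))
    return lam * g ** lam / (1.0 + math.exp(-x))

def ghld_quantile(u, lam):
    """Inverse CDF: solve (2e^{-x}/(1+e^{-x}))^lam = 1-u for x."""
    s = (1.0 - u) ** (1.0 / lam)       # s = 2e^{-x}/(1+e^{-x})
    return math.log((2.0 - s) / s)     # since e^{-x} = s/(2-s)
```

The quantile function gives an immediate inverse-transform sampler for the Monte Carlo experiments mentioned in the paper: draw U uniform on (0, 1) and return `ghld_quantile(U, lam)`.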

Entropy Estimation
Let X be a random variable with cdf F(x) and pdf f(x). Then, the differential entropy of X is defined by Cover and Thomas [9] as:

H(X) = -\int_{-\infty}^{\infty} f(x) \log f(x) \, dx.    (3)

Let X_{1:n}, X_{2:n}, \dots, X_{n:n} be the order statistics of random samples X_1, X_2, \dots, X_n from the continuous pdf f(x). In the conventional Type-II censoring scheme, r is assumed to be known in advance, and the experiment is terminated as soon as the r-th item fails (r ≤ n). Park [10] provided the entropy of a continuous probability distribution based on Type-II censored samples in terms of the hazard function, h(x) = f(x)/(1 − F(x)), as:

H_{1 \cdots r} = \sum_{i=1}^{r} \left[ 1 - \log(n-i+1) \right] - \sum_{i=1}^{r} \int_{-\infty}^{\infty} f_{i:n}(x) \log h(x) \, dx,    (4)

where f_{i:n}(x) is the pdf of the i-th order statistic, X_{i:n}. He also derived a nonparametric entropy estimator of Equation (4) as:

\hat{H}(m, r) = \sum_{i=1}^{r} \left[ 1 - \log(n-i+1) \right] + \sum_{i=1}^{r} \log \left[ \frac{(n-i+1)\left( X_{i+m:n} - X_{i-m:n} \right)}{2m} \right],    (5)

where m is a positive integer smaller than r/2, which is referred to as the window size; X_{i−m:n} = X_{1:n} for i − m < 1 and X_{i+m:n} = X_{r:n} for i + m > r.
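The spacing idea behind Park's nonparametric estimator can be sketched as a plug-in version of Equation (4): estimate the density at X_{i:n} from the 2m-spacing and the survival function by (n − i + 1)/n, then plug the resulting hazard estimate into the sum. The following Python sketch is our own illustration of that plug-in logic (the particular survival estimate is an assumption here), not the paper's code.

```python
import math

def entropy_typeII(x_sorted, n, m):
    """Spacing-based plug-in estimate of the Type-II censored-sample entropy.

    x_sorted: the first r observed order statistics (sorted), out of n items.
    m: window size, a positive integer smaller than r/2.
    """
    r = len(x_sorted)
    total = 0.0
    for i in range(1, r + 1):
        lo = x_sorted[max(i - m, 1) - 1]   # X_{i-m:n} -> X_{1:n} if i-m < 1
        hi = x_sorted[min(i + m, r) - 1]   # X_{i+m:n} -> X_{r:n} if i+m > r
        f_hat = 2.0 * m / (n * (hi - lo))  # spacing estimate of the density
        s_hat = (n - i + 1) / n            # survival estimate at X_{i:n} (assumed form)
        h_hat = f_hat / s_hat              # hazard estimate h = f/(1-F)
        total += 1.0 - math.log(n - i + 1) - math.log(h_hat)
    return total
```

Each summand mirrors one term 1 − log(n − i + 1) − log h(X_{i:n}) of the plug-in version of Equation (4).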
Theorem 1. The entropy of the GHLD based on Type-II censored samples is:

H_{1 \cdots r} = \sum_{i=1}^{r} \left[ 1 - \log(n-i+1) \right] - r \log \lambda + \sum_{i=1}^{r} w_i(\lambda),    (6)

where:

w_i(\lambda) = \sum_{k=1}^{\infty} \frac{1}{k \, 2^{k}} \cdot \frac{\Gamma(n+1) \, \Gamma(n-i+1+k/\lambda)}{\Gamma(n-i+1) \, \Gamma(n+1+k/\lambda)}.    (7)

Proof. Let u = 1 − F(x). From Equations (1) and (2), the hazard function of the GHLD is h(x) = λ/(1 + e^{−x}), and 1 + e^{−x} = 2/(2 − u^{1/λ}). Then:

\log h(x) = \log \lambda - \log \left( 1 + e^{-x} \right) = \log \lambda - \sum_{k=1}^{\infty} \frac{u^{k/\lambda}}{k \, 2^{k}},    (8)

by using the series expansion −log(1 − t) = \sum_{k=1}^{\infty} t^{k}/k with t = u^{1/λ}/2. Since 1 − F(X_{i:n}) follows a beta distribution with parameters (n − i + 1, i):

E \left[ \left( 1 - F(X_{i:n}) \right)^{k/\lambda} \right] = \frac{\Gamma(n+1) \, \Gamma(n-i+1+k/\lambda)}{\Gamma(n-i+1) \, \Gamma(n+1+k/\lambda)},    (9)

and substituting Equation (8) into Equation (4) completes the proof.

Note that the entropy in Equation (6) depends on the shape parameter, λ. The following subsections estimate λ by using two Bayesian inference methods: one method makes use of a conjugate prior; the other, distributions of the hyperparameters, that is, the parameters of the prior.
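Park's representation reduces the censored-sample entropy to H = Σ_{i=1}^{r} [1 − log(n − i + 1) − E log h(X_{i:n})], and for the GHLD the hazard is h(x) = λ/(1 + e^{−x}). The expectation can be checked numerically using the fact that 1 − F(X_{i:n}) ~ Beta(n − i + 1, i), which avoids relying on any particular series form. The following sketch (our own, not the paper's code) does this with a simple midpoint rule.

```python
import math

def beta_pdf(u, a, b):
    """Beta(a, b) density; used because 1 - F(X_{i:n}) ~ Beta(n-i+1, i)."""
    c = math.gamma(a + b) / (math.gamma(a) * math.gamma(b))
    return c * u ** (a - 1) * (1.0 - u) ** (b - 1)

def ghld_entropy_typeII(lam, n, r, grid=20000):
    """Numeric evaluation of the censored-sample entropy of the GHLD.

    Uses H = sum_i [1 - log(n-i+1) - E log h(X_{i:n})] with the GHLD hazard
    log h(x) = log(lam) - log(2) + log(2 - u^{1/lam}), where u = 1 - F(x).
    """
    total = 0.0
    for i in range(1, r + 1):
        a, b = n - i + 1, i
        # midpoint rule for E[log(2 - U^{1/lam})], U ~ Beta(a, b)
        e_log = 0.0
        for k in range(grid):
            u = (k + 0.5) / grid
            e_log += math.log(2.0 - u ** (1.0 / lam)) * beta_pdf(u, a, b) / grid
        e_log_h = math.log(lam) - math.log(2.0) + e_log
        total += 1.0 - math.log(n - i + 1) - e_log_h
    return total
```

As a sanity check, for λ = 1 and n = r = 1 this returns 2 − log 2, the differential entropy of the standard half-logistic distribution.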

Bayes Estimation
Theorem 2. A natural conjugate prior for the shape parameter, λ, is the gamma prior, given by:

\pi(\lambda) = \frac{\beta^{\alpha}}{\Gamma(\alpha)} \lambda^{\alpha-1} e^{-\beta \lambda}, \quad \lambda > 0, \; \alpha, \beta > 0.    (10)

Then, the Bayes estimator of λ based on the squared error loss function (SELF) is obtained as:

\hat{\lambda}_{B} = \frac{r + \alpha}{\beta + h},    (11)

where:

h = -\sum_{i=1}^{r} \log \left( \frac{2e^{-x_{i:n}}}{1+e^{-x_{i:n}}} \right) - (n-r) \log \left( \frac{2e^{-x_{r:n}}}{1+e^{-x_{r:n}}} \right).    (12)

Proof. The likelihood function based on the Type-II censoring scheme is written as:

L(\lambda) = \frac{n!}{(n-r)!} \prod_{i=1}^{r} f(x_{i:n}) \left[ 1 - F(x_{r:n}) \right]^{n-r} \propto \lambda^{r} e^{-\lambda h}.    (13)

Then, it is easy to obtain the MLE of λ as \hat{\lambda} = r/h from Equation (13).
From the gamma prior in Equation (10) and the likelihood function in Equation (13), the posterior distribution of λ is obtained as:

\pi(\lambda \mid x) = \frac{(\beta + h)^{r+\alpha}}{\Gamma(r+\alpha)} \lambda^{r+\alpha-1} e^{-(\beta+h)\lambda},    (14)

which is a gamma distribution with parameters (r + α, β + h). The Bayes estimator of λ based on the SELF is the posterior mean, and therefore, this completes the proof.
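The estimators in Theorem 2 are straightforward to compute from data. The following Python sketch (function names are ours) computes the statistic h = −Σ log(2e^{−x_{i:n}}/(1 + e^{−x_{i:n}})) − (n − r) log(2e^{−x_{r:n}}/(1 + e^{−x_{r:n}})) from the likelihood, then the MLE r/h and the posterior mean (r + α)/(β + h).

```python
import math

def neg_log_g(x):
    """-log(2e^{-x}/(1+e^{-x})), the exponent contributed by one observation."""
    return x + math.log(1.0 + math.exp(-x)) - math.log(2.0)

def estimates_typeII(x_sorted, n, alpha, beta):
    """MLE and Bayes estimate (under squared error loss) of the GHLD shape.

    x_sorted: the first r observed order statistics (sorted), out of n items.
    alpha, beta: hyperparameters of the Gamma(alpha, beta) conjugate prior.
    The likelihood is proportional to lam^r * exp(-lam * h).
    """
    r = len(x_sorted)
    h = sum(neg_log_g(x) for x in x_sorted) + (n - r) * neg_log_g(x_sorted[-1])
    mle = r / h                          # maximizer of lam^r * exp(-lam*h)
    bayes = (r + alpha) / (beta + h)     # mean of the Gamma(r+alpha, beta+h) posterior
    return mle, bayes
```

Note that as α, β → 0 the Bayes estimate collapses to the MLE, a standard check on the conjugate update.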

E-Bayesian Estimation
Han [11] introduced a new Bayesian estimation method, referred to as the expected Bayesian (E-Bayesian) estimation method, to estimate the failure probability. The present paper uses this method to obtain another Bayes estimator of the shape parameter, λ.
In Subsection 2.1, the gamma prior was used to obtain the Bayes estimator of λ. Hence, according to Han [12], the following prior distributions for α and β are considered:

\pi_1(\alpha, \beta) = \frac{2(c-\beta)}{c^{2}}, \quad \pi_2(\alpha, \beta) = \frac{1}{c}, \quad \pi_3(\alpha, \beta) = \frac{2\beta}{c^{2}}, \quad 0 < \alpha < 1, \; 0 < \beta < c,    (15)

which are decreasing, constant and increasing functions of β, respectively.
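The E-Bayesian estimate is the conjugate-prior Bayes estimate averaged over a hyperparameter prior. The sketch below is our own numeric illustration, assuming Han-style priors on 0 < α < 1, 0 < β < c (the ranges and prior forms are assumptions for illustration) and the posterior-mean estimate (r + α)/(β + h).

```python
def e_bayes(r, h, c, prior, grid=200):
    """E-Bayesian estimate of lam: average (r+alpha)/(beta+h) over the
    hyperparameter prior, with alpha uniform on (0,1) and beta on (0,c)
    weighted by prior(beta, c). Plain midpoint-rule double integral."""
    total = 0.0
    for i in range(grid):
        a = (i + 0.5) / grid             # alpha midpoint in (0, 1)
        for j in range(grid):
            b = (j + 0.5) * c / grid     # beta midpoint in (0, c)
            total += (r + a) / (b + h) * prior(b, c) * (c / grid) * (1.0 / grid)
    return total

# three beta-priors: decreasing, constant and increasing in beta
p1 = lambda b, c: 2.0 * (c - b) / c ** 2
p2 = lambda b, c: 1.0 / c
p3 = lambda b, c: 2.0 * b / c ** 2
```

Because (r + α)/(β + h) is decreasing in β, the decreasing prior p1 yields the largest E-Bayesian estimate and the increasing prior p3 the smallest, which gives a quick ordering check on any implementation.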

Table 4. MSEs (biases) of entropy estimators with the MLE and the Bayes estimators.