Parameter Estimation of Lindley Distribution Based on Progressive Type-II Censored Competing Risks Data with Binomial Removals
Abstract
1. Introduction
1.1. Lindley Distribution
1.2. Progressive Type-II Censored Data with Binomial Removals
1.3. Competing Risks
2. Formatting of Mathematical Components
2.1. Maximum Likelihood Estimation
2.2. Bayesian Estimation
3. Simulation Study
Algorithm 1 Generating progressively Type-II censored samples with competing risks.
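A hedged Python sketch of this generation scheme, assuming two independent Lindley cause-specific lifetimes with parameters $\theta_1$, $\theta_2$ and removals drawn as $R_i \sim \mathrm{Binomial}(n-m-\sum_{l<i}R_l,\,p)$; the Lindley draws use the standard mixture representation (Exp($\theta$) with probability $\theta/(1+\theta)$, otherwise Gamma(2, rate $\theta$)). Function names and structure are illustrative, not the authors' code:

```python
import random

def rlindley(theta, rng=random):
    """One Lindley(theta) draw via the mixture representation:
    Exp(theta) w.p. theta/(1+theta), else Gamma(2, rate=theta)."""
    if rng.random() < theta / (1.0 + theta):
        return rng.expovariate(theta)
    # Gamma(2, rate=theta) is the sum of two independent Exp(theta) draws
    return rng.expovariate(theta) + rng.expovariate(theta)

def gen_sample(n, m, p, theta1, theta2, rng=random):
    """Progressively Type-II censored competing-risks sample with
    binomial removals (illustrative sketch).

    Each unit fails at the minimum of two independent Lindley lifetimes;
    the indicator records which cause occurred first.  After the i-th
    observed failure (i < m), R_i ~ Binomial(n - m - earlier removals, p)
    surviving units are withdrawn; all survivors go at the m-th failure.
    Returns the m (time, cause) pairs and the removal numbers R_1..R_m.
    """
    units = []
    for _ in range(n):
        t1, t2 = rlindley(theta1, rng), rlindley(theta2, rng)
        units.append((t1, 1) if t1 < t2 else (t2, 2))
    alive = sorted(units)                  # survivors, ordered by failure time

    data, removals, removed = [], [], 0
    for i in range(m):
        data.append(alive.pop(0))          # observe the next failure
        if i < m - 1:
            cap = n - m - removed          # binomial parameter for R_i
            r = sum(rng.random() < p for _ in range(cap))  # Binomial(cap, p)
        else:
            r = len(alive)                 # R_m: withdraw everything left
        removed += r
        for _ in range(r):                 # drop r survivors at random
            alive.pop(rng.randrange(len(alive)))
        removals.append(r)
    return data, removals
```

Because $R_i$ is capped by $n-m-\sum_{l<i}R_l$, enough units always survive to yield all $m$ observed failures.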

 (1) Under the same loss function, the Bias and MSE of the informative Bayesian estimates tend to be smaller than those of the non-informative Bayesian estimates.
 (2) With n and the other settings fixed, the Bias and MSE of the estimates decrease as m increases; in general, they also decrease as n increases.
 (3) With prior information, the Bayesian estimates under the SEL and LL loss functions have smaller Bias and MSE than the maximum likelihood estimates, whereas the estimates under the EL loss function show no such clear trend.
 (1) The choice of prior information is important, and the optimal prior differs across loss functions. Under SEL, the Informative-I prior gives the best estimates and the Informative-II prior the largest estimation error; under EL, the Informative-II prior is best and the Informative-I prior worst; under LL, the Informative-II prior gives the best estimates.
 (2) With n and the other settings fixed, the Bias and MSE of the estimates of p become larger as m increases in most cases; only under the EL loss function with the Informative-II prior do they become smaller as m increases.
 (3) On the whole, the Bias and MSE become smaller as n increases, although this trend is not evident under the EL loss function.
 (4) Under LL with the Informative-II prior, the Bayesian estimates have smaller Bias and MSE than the maximum likelihood estimates; in the other cases, the Bayesian and maximum likelihood estimates are similar.
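The Bias and MSE figures quoted in these comparisons are the usual Monte-Carlo summaries over repeated simulated samples; a minimal sketch, where reporting the absolute bias is an assumption made to match the sign-free values in the tables:

```python
def bias_mse(estimates, true_value):
    """Monte-Carlo Bias and MSE of a sequence of point estimates.
    Bias is reported in absolute value (assumed convention)."""
    k = len(estimates)
    bias = abs(sum(estimates) / k - true_value)           # |mean - true|
    mse = sum((e - true_value) ** 2 for e in estimates) / k
    return bias, mse
```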
4. Data Analysis
5. Conclusions
Author Contributions
Funding
Acknowledgments
Conflicts of Interest
References
Lindley, D.V. Fiducial Distributions and Bayes' Theorem. J. R. Stat. Soc. Ser. B Methodol. 1958, 20, 102–107.
Ghitany, M.E.; Atieh, B.; Nadarajah, S. Lindley distribution and its application. Math. Comput. Simul. 2008, 78, 493–506.
Bakouch, H.S.; Al-Zahrani, B.M.; Al-Shomrani, A.A.; Marchi, V.A.A.; Louzada, F. An extended Lindley distribution. J. Korean Stat. Soc. 2012, 41, 75–85.
Ghitany, M.E.; Al-Mutairi, D.K.; Balakrishnan, N.; Al-Enezi, L.J. Power Lindley distribution and associated inference. Comput. Stat. Data Anal. 2013, 64, 20–33.
Nadarajah, S.; Bakouch, H.S.; Tahmasbi, R. A generalized Lindley distribution. Sankhya B 2011, 73, 331–359.
Ghitany, M.E.; Al-Mutairi, D.K.; Aboukhamseen, S.M. Estimation of the Reliability of a Stress-Strength System from Power Lindley Distributions. Commun. Stat. Simul. Comput. 2015, 44, 118–136.
Gómez-Déniz, E.; Sordo, M.A.; Calderín-Ojeda, E. The log-Lindley distribution as an alternative to the beta regression model with applications in insurance. Insur. Math. Econ. 2014, 54, 49–57.
Pareek, B.; Kundu, D.; Kumar, S. On progressively censored competing risks data for Weibull distributions. Comput. Stat. Data Anal. 2009, 53, 4083–4094.
Doostparast, M.; Ahmadi, M.V.; Ahmadi, J. Bayes Estimation Based on Joint Progressive Type II Censored Data Under LINEX Loss Function. Commun. Stat. Simul. Comput. 2013, 42, 1865–1886.
Al-Hussaini, E.K.; Abdel-Hamid, A.H.; Hashem, A.F. One-sample Bayesian prediction intervals based on progressively type-II censored data from the half-logistic distribution under progressive stress model. Metrika 2015, 78, 771–783.
Chacko, M.; Mohan, R. Bayesian analysis of Weibull distribution based on progressive type-II censored competing risks data with binomial removals. Comput. Stat. 2019, 34, 233–252.
Kim, H.T. Cumulative Incidence in Competing Risks Data and Competing Risks Regression Analysis. Clin. Cancer Res. 2007, 13, 559–565.
Wenhua, H.; Gang, L.; Ning, L. A Bayesian approach to joint analysis of longitudinal measurements and competing risks failure time data. Stat. Med. 2010, 28, 1601–1619.
Bakoyannis, G.; Touloumi, G. Practical methods for competing risks data: A review. Stat. Methods Med. Res. 2012, 21, 257–272.
Austin, P.C.; Lee, D.S.; Fine, J.P. Introduction to the Analysis of Survival Data in the Presence of Competing Risks. Circulation 2016, 133, 601–609.
Sarhan, A.M.; Alameri, M.; Al-Wasel, I. Analysis of progressive censoring competing risks data with binomial removals. Int. J. Math. Anal. 2008, 2, 965–976.
Lindley, D.V. Approximate Bayesian methods. Trabajos de Estadística y de Investigación Operativa 1980, 31, 223–245.
Balakrishnan, N.; Cramer, E. Progressive Type-II Censoring: Distribution Theory; Springer: New York, NY, USA, 2014.
Lawless, J.F. Statistical Models and Methods for Lifetime Data. Technometrics 1983, 25, 111–112.
Bias and MSE of the estimates of ${\theta}_{1}$. Column order: $p$, $n$, $m$; then Bias and MSE under SEL, EL and LL, each with the Informative-I prior first and the Non-Informative prior second; the final two columns give the Bias and MSE of the ML estimate.
0.3  30  20  0.1896  0.0573  0.2022  0.0653  0.1976  0.0628  0.2051  0.0656  0.1868  0.0565  0.1950  0.0579  0.1977  0.0655 
25  0.1687  0.0453  0.1780  0.0511  0.1774  0.0484  0.1876  0.0529  0.1678  0.0452  0.1761  0.0496  0.1808  0.0515  
50  30  0.1586  0.0404  0.1655  0.0425  0.1562  0.0373  0.1726  0.0458  0.1605  0.0416  0.1678  0.0433  0.1608  0.0405  
40  0.1425  0.0316  0.1454  0.0329  0.1451  0.0323  0.1487  0.0342  0.1460  0.0331  0.1484  0.0333  0.1429  0.0314  
60  40  0.1375  0.0294  0.1509  0.0355  0.1481  0.0333  0.1435  0.0320  0.1406  0.0306  0.1509  0.0352  0.1466  0.0331  
50  0.1289  0.0254  0.1315  0.0261  0.1292  0.0254  0.1394  0.0295  0.1292  0.0256  0.1366  0.0287  0.1313  0.0260  
0.6  30  20  0.2011  0.0664  0.2102  0.0712  0.1857  0.0541  0.2049  0.0653  0.1886  0.0569  0.2076  0.0650  0.2094  0.0727 
25  0.1694  0.0459  0.1788  0.0482  0.1818  0.0501  0.1858  0.0526  0.1699  0.0463  0.1940  0.0590  0.1808  0.0503  
50  30  0.1620  0.0411  0.1664  0.0436  0.1615  0.0405  0.1703  0.0443  0.1655  0.0417  0.1716  0.0451  0.1663  0.0435  
40  0.1442  0.0322  0.1437  0.0320  0.1463  0.0324  0.1621  0.0389  0.1410  0.0307  0.1454  0.0316  0.1512  0.0352  
60  40  0.1364  0.0296  0.1465  0.0341  0.1503  0.0339  0.1472  0.0335  0.1388  0.0294  0.1464  0.0325  0.1521  0.0347  
50  0.1294  0.0261  0.1331  0.0269  0.1313  0.0268  0.1351  0.0279  0.1299  0.0252  0.1374  0.0292  0.1323  0.0261  
0.9  30  20  0.2011  0.0656  0.1999  0.0635  0.1955  0.0613  0.1977  0.0622  0.1886  0.0575  0.2101  0.0696  0.2097  0.0686 
25  0.1813  0.0511  0.1870  0.0546  0.1837  0.0515  0.1923  0.0578  0.1673  0.0447  0.1794  0.0496  0.1814  0.0498  
50  30  0.1592  0.0390  0.1619  0.0403  0.1639  0.0411  0.1800  0.0496  0.1583  0.0398  0.1673  0.0440  0.1755  0.0471  
40  0.1389  0.0300  0.1438  0.0315  0.1457  0.0323  0.1529  0.0349  0.1400  0.0304  0.1485  0.0342  0.1488  0.0337  
60  40  0.1418  0.0329  0.1484  0.0332  0.1476  0.0335  0.1468  0.0322  0.1439  0.0318  0.1484  0.0339  0.1433  0.0326  
50  0.1358  0.0276  0.1370  0.0279  0.1316  0.0267  0.1414  0.0301  0.1301  0.0259  0.1393  0.0290  0.1323  0.0267 
Bias and MSE of the estimates of ${\theta}_{2}$. Column order: $p$, $n$, $m$; then Bias and MSE under SEL, EL and LL, each with the Informative-I prior first and the Non-Informative prior second; the final two columns give the Bias and MSE of the ML estimate.
0.3  30  20  0.2011  0.0653  0.2036  0.0646  0.2086  0.0687  0.2121  0.0697  0.1917  0.0595  0.2004  0.0616  0.1986  0.0619 
25  0.1707  0.0461  0.1877  0.0552  0.1945  0.0602  0.1908  0.0565  0.1765  0.0497  0.1857  0.0540  0.1841  0.0528  
50  30  0.1599  0.0408  0.1654  0.0431  0.1659  0.0439  0.1677  0.0448  0.1624  0.0411  0.1718  0.0459  0.1680  0.0429  
40  0.1407  0.0304  0.1523  0.0356  0.1465  0.0327  0.1532  0.0358  0.1415  0.0308  0.1479  0.0342  0.1450  0.0327  
60  40  0.1412  0.0318  0.1447  0.0329  0.1515  0.0358  0.1568  0.0372  0.1475  0.0334  0.1481  0.0331  0.1468  0.0329  
50  0.1309  0.0265  0.1313  0.0263  0.1308  0.0267  0.1444  0.0302  0.1349  0.0275  0.1326  0.0266  0.1331  0.0265  
0.6  30  20  0.1958  0.0650  0.1985  0.0608  0.2147  0.0745  0.2141  0.0744  0.1902  0.0595  0.2102  0.0703  0.2094  0.0687 
25  0.1809  0.0525  0.1819  0.0530  0.1877  0.0536  0.1977  0.0604  0.1811  0.0525  0.1880  0.0558  0.1833  0.0521  
50  30  0.1631  0.0413  0.1700  0.0443  0.1760  0.0480  0.1813  0.0528  0.1666  0.0433  0.1703  0.0431  0.1703  0.0445  
40  0.1395  0.0308  0.1526  0.0350  0.1459  0.0329  0.1575  0.0372  0.1389  0.0297  0.1480  0.0334  0.1450  0.0319  
60  40  0.1387  0.0301  0.1485  0.0339  0.1462  0.0334  0.1520  0.0358  0.1448  0.0325  0.1510  0.0349  0.1442  0.0320  
50  0.1274  0.0256  0.1359  0.0285  0.1326  0.0269  0.1422  0.0310  0.1310  0.0262  0.1402  0.0293  0.1374  0.0286  
0.9  30  20  0.1871  0.0581  0.1998  0.0644  0.2113  0.0705  0.2205  0.0747  0.2013  0.0633  0.2047  0.0641  0.2066  0.0695 
25  0.1744  0.0492  0.1793  0.0526  0.1830  0.0515  0.1984  0.0611  0.1868  0.0552  0.1914  0.0563  0.1830  0.0518  
50  30  0.1703  0.0451  0.1651  0.0419  0.1738  0.0462  0.1833  0.0519  0.1631  0.0414  0.1709  0.0458  0.1710  0.0451  
40  0.1436  0.0320  0.1468  0.0338  0.1498  0.0346  0.1548  0.0364  0.1447  0.0323  0.1517  0.0349  0.1523  0.0347  
60  40  0.1401  0.0312  0.1486  0.0340  0.1456  0.0338  0.1511  0.0356  0.1428  0.0312  0.1450  0.0325  0.1462  0.0317  
50  0.1241  0.0243  0.1335  0.0272  0.1365  0.0282  0.1409  0.0299  0.1289  0.0253  0.1402  0.0293  0.1294  0.0259 
Bias and MSE of the estimates of $p$. Column order: $n$, $m$; then Bias and MSE under SEL, EL and LL, each with the Non-Informative, Informative-I and Informative-II priors in turn; the final two columns give the Bias and MSE of the ML estimate.
30  20  0.1000  0.0164  0.0970  0.0157  0.1040  0.0176  0.1712  0.0313  0.1764  0.0334  0.1553  0.0259  0.0931  0.0145  0.1043  0.0178  0.0788  0.0100  0.0917  0.0136 
25  0.1453  0.0359  0.1370  0.0317  0.1609  0.0414  0.1763  0.0346  0.1880  0.0388  0.1497  0.0253  0.1266  0.0265  0.1548  0.0430  0.0960  0.0142  0.1328  0.0303  
50  30  0.0667  0.0076  0.0650  0.0069  0.0710  0.0082  0.1698  0.0300  0.1736  0.0312  0.1616  0.0271  0.0629  0.0066  0.0676  0.0075  0.0586  0.0054  0.0640  0.0067 
40  0.0984  0.0160  0.0921  0.0142  0.1119  0.0201  0.1704  0.0312  0.1759  0.0331  0.1584  0.0270  0.0936  0.0141  0.0963  0.0154  0.0788  0.0099  0.0941  0.0151  
60  40  0.0653  0.0073  0.0636  0.0066  0.0675  0.0077  0.1690  0.0297  0.1712  0.0305  0.1612  0.0271  0.0614  0.0061  0.0671  0.0073  0.0599  0.0059  0.0651  0.0067 
50  0.0974  0.0157  0.0934  0.0138  0.1022  0.0175  0.1720  0.0317  0.1771  0.0335  0.1564  0.0262  0.0904  0.0137  0.0935  0.0150  0.0768  0.0094  0.0915  0.0137 
Scheme 1 ($p=0.3$, $m=25$)  
${R}_{i}^{0}$  [3,3,1,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0]  
Data  (11, 2), (329, 2), (1062, 2), (1594, 2), (1925, 1), (1990, 1), (2327, 2), (2400, 1), (2451, 2),  
(2471, 1), (2551, 1), (2568, 1), (2694, 1), (2702, 2), (2761, 2), (2831, 2), (3034, 1), (3059, 2),  
(3112, 1), (3214, 1), (3478, 1), (3504, 1), (4329, 1), (6976, 1), (7846, 1)  
Scheme 2 ($p=0.3$, $m=20$)  
${R}_{i}^{0}$  [2,5,2,1,1,0,1,0,0,1,0,0,0,0,0,0,0,0,0,0]  
Data  (11, 2), (170, 2), (1167, 1),(1990, 1), (2327, 2), (2451, 2), (2471, 1), (2568, 1), (2694, 1),  
(2702, 2),(2831, 2),(3034, 1), (3059, 2), (3112, 1), (3214, 1), (3478, 1), (3504, 1), (4329, 1),  
(6976, 1), (7846, 1)  
Scheme 3 ($p=0.3$, $m=15$)  
${R}_{i}^{0}$  [6,4,2,2,0,1,2,0,0,0,0,0,1,0,0]  
Data  (11, 2), (958, 2), (1990, 1),(2400, 1), (2551, 1), (2568, 1), (2702, 2), (3034, 1), (3059, 2),  
(3112, 1), (3214, 1), (3478, 1), (3504, 1),(6976, 1), (7846, 1)  
Scheme 4 ($p=0.6$, $m=25$)  
${R}_{i}^{0}$  [7,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0]  
Data  (11, 2), (1062, 2), (1167, 1), (1925, 1), (1990, 1), (2223, 1), (2327, 2), (2400, 1), (2451, 2),  
(2471, 1), (2551, 1), (2568, 1), (2694, 1), (2702, 2), (2761, 2), (2831, 2), (3034, 1), (3059, 2),  
(3112, 1), (3214, 1), (3478, 1), (3504, 1), (4329, 1), (6976, 1), (7846, 1)  
Scheme 5 ($p=0.6$, $m=20$)  
${R}_{i}^{0}$  [5,5,2,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0]  
Data  (11, 2), (708, 2), (1990, 1),(2400, 1), (2451, 2), (2471, 1), (2568, 1), (2694, 1), (2702, 2),  
(2761, 2), (2831, 2),(3034, 1), (3059, 2), (3112, 1), (3214, 1), (3478, 1), (3504, 1), (4329, 1),  
(6976, 1), (7846, 1)  
Scheme 6 ($p=0.6$, $m=15$)  
${R}_{i}^{0}$  [13,3,0,2,0,0,0,0,0,0,0,0,0,0,0]  
Data  (11, 2), (2327, 2), (2551, 1), (2568, 1), (2761, 2), (2831, 2), (3034, 1), (3059, 2), (3112, 1),  
(3214, 1), (3478, 1), (3504, 1), (4329, 1), (6976, 1), (7846, 1)  
Scheme 7 ($p=0.9$, $m=25$)  
${R}_{i}^{0}$  [8,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0]  
Data  (11, 2),(1167, 1), (1594, 2), (1925, 1), (1990, 1), (2223, 1), (2327, 2), (2400, 1), (2451, 2),  
(2471, 1), (2551, 1), (2568, 1), (2694, 1), (2702, 2), (2761, 2), (2831, 2), (3034, 1), (3059, 2),  
(3112, 1), (3214, 1), (3478, 1), (3504, 1), (4329, 1), (6976, 1), (7846, 1)  
Scheme 8 ($p=0.9$, $m=20$)  
${R}_{i}^{0}$  [12,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0]  
Data  (11, 2), (2223, 1), (2400, 1), (2451, 2), (2471, 1), (2551, 1), (2568, 1), (2694, 1), (2702, 2),  
(2761, 2), (2831, 2), (3034, 1), (3059, 2), (3112, 1), (3214, 1), (3478, 1), (3504, 1), (4329, 1),  
(6976, 1), (7846, 1)  
Scheme 9 ($p=0.9$, $m=15$)  
${R}_{i}^{0}$  [17,1,0,0,0,0,0,0,0,0,0,0,0,0,0]  
Data  (11, 2),(2551, 1),(2694, 1), (2702, 2), (2761, 2), (2831, 2), (3034, 1), (3059, 2), (3112, 1),  
(3214, 1),(3478, 1),(3504, 1), (4329, 1), (6976, 1), (7846, 1) 
SEL  EL  
Schemes  ${\theta}_{1}\left({10}^{4}\right)$  ${\theta}_{2}\left({10}^{4}\right)$  $p$  ${\theta}_{1}\left({10}^{4}\right)$  ${\theta}_{2}\left({10}^{4}\right)$  $p$ 
1  4.7627  3.1069  0.4737  4.6212  2.8987  0.2963 
2  4.1513  2.3775  0.3111  4.0113  2.1496  0.2241 
3  4.1476  2.3738  0.3065  3.9891  2.1182  0.2250 
4  4.7648  2.3806  0.8182  4.6366  2.1436  0.4211 
5  4.1516  2.3777  0.5000  4.0106  2.1483  0.3171 
6  4.0876  2.3748  0.6786  3.9130  2.0987  0.3913 
7  4.7650  2.3808  1.0000  4.6365  2.1435  0.4706 
8  4.7640  2.3798  0.9333  4.6098  2.0993  0.4643 
9  4.1219  2.3782  0.9500  3.9384  2.0870  0.4737 
LL  ML  
Schemes  ${\theta}_{1}\left({10}^{4}\right)$  ${\theta}_{2}\left({10}^{4}\right)$  $p$  ${\theta}_{1}\left({10}^{4}\right)$  ${\theta}_{2}\left({10}^{4}\right)$  $p$ 
1  4.7627  3.1069  0.4380  4.768372  3.112606  0.4444 
2  4.1513  2.3775  0.2932  4.15802  2.384186  0.2955 
3  4.1476  2.3738  0.2934  4.15802  2.384186  0.2951 
4  4.7647  2.3806  0.7925  4.768372  2.384186  0.8000 
5  4.1515  2.3777  0.4770  4.15802  2.384186  0.4815 
6  4.0876  2.3747  0.6627  4.097034  2.384186  0.6667 
7  4.7649  2.3807  1.0000  4.768372  2.384186  0.8889 
8  4.7639  2.3797  0.9263  4.768372  2.384186  0.9286 
9  4.1219  2.3782  0.9461  4.127871  2.384186  0.9474 
Nie, J.; Gui, W. Parameter Estimation of Lindley Distribution Based on Progressive Type-II Censored Competing Risks Data with Binomial Removals. Mathematics 2019, 7, 646. https://doi.org/10.3390/math7070646