# A Type I Generalized Logistic Distribution: Solving Its Estimation Problems with a Bayesian Approach and Numerical Applications Based on Simulated and Engineering Data


## Abstract


## 1. Introduction

An `R` package named `glogis` [18] was published without further discussion about this problem [19].

An `R` package named `lmom` [25] has been implemented for the L-moment method.
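As background, the first two sample L-moments that such routines compute are simple linear combinations of probability-weighted moments [22,23]. Below is a minimal Python sketch of that generic computation (the function name is ours; estimating the Type I generalized logistic parameters would additionally require matching these quantities to the theoretical L-moments of the distribution, which we do not derive here):

```python
def sample_l_moments(data):
    """First two sample L-moments via probability-weighted moments (Hosking, 1990):
    l1 = b0 (the sample mean) and l2 = 2*b1 - b0, where b1 weights the
    order statistics by their rank. Requires at least two observations."""
    x = sorted(data)
    n = len(x)
    b0 = sum(x) / n
    # b1 = (1/n) * sum over order statistics of ((i-1)/(n-1)) * x_(i), 1-based i
    b1 = sum(i * xi for i, xi in enumerate(x)) / (n * (n - 1))
    return b0, 2.0 * b1 - b0
```

The second L-moment equals half the Gini mean difference, so it plays the role of a robust scale measure in the matching equations.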

## 2. The Type I Generalized Logistic Distribution and Related Distributions
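For reference, the Type I generalized logistic distribution with location $\mu$, scale $\sigma>0$, and shape $b>0$ has CDF $F(x)=[1+\exp(-(x-\mu)/\sigma)]^{-b}$, which reduces to the standard logistic distribution when $b=1$ [3,32]. A minimal Python sketch of the CDF and PDF (the function names are ours):

```python
import math

def tgl_cdf(x, mu=0.0, sigma=1.0, b=1.0):
    # F(x) = (1 + exp(-(x - mu)/sigma))**(-b); logistic CDF when b = 1
    return (1.0 + math.exp(-(x - mu) / sigma)) ** (-b)

def tgl_pdf(x, mu=0.0, sigma=1.0, b=1.0):
    # f(x) = (b/sigma) * exp(-z) / (1 + exp(-z))**(b + 1), with z = (x - mu)/sigma,
    # obtained by differentiating the CDF
    z = math.exp(-(x - mu) / sigma)
    return (b / sigma) * z / (1.0 + z) ** (b + 1.0)
```

The shape parameter $b$ controls the asymmetry: the density is symmetric only in the logistic case $b=1$.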

## 3. Inference: Classical Approaches

## 4. Inference: A Bayesian Approach

Implementations are available in `R`, `Python`, `MATLAB`, and `C++`. Using the `R` implementation for our simulated scenarios, it was not possible to achieve reasonable results.
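For readers who want to experiment, the core of any such Bayesian fit is a sampler that explores the Type I generalized logistic log-likelihood. Below is a deliberately simple random-walk Metropolis sketch in Python under a flat prior on the admissible region; this is not the t-walk algorithm [35] used in the paper, and the function names and tuning constants are ours:

```python
import math, random

def loglik(data, mu, sigma, b):
    # Log-likelihood of the Type I GL: sum over observations of
    # log(b) - log(sigma) - z - (b + 1)*log(1 + exp(-z)), z = (x - mu)/sigma
    if sigma <= 0.0 or b <= 0.0:
        return -math.inf
    ll = 0.0
    for x in data:
        z = (x - mu) / sigma
        tail = math.log1p(math.exp(-z)) if z > -30.0 else -z  # avoid overflow
        ll += math.log(b) - math.log(sigma) - z - (b + 1.0) * tail
    return ll

def metropolis(data, n_iter=5000, step=0.1, seed=1):
    # Random-walk Metropolis over theta = (mu, sigma, b); proposals outside
    # sigma, b > 0 get log-likelihood -inf and are always rejected
    rng = random.Random(seed)
    theta = [0.0, 1.0, 1.0]
    cur = loglik(data, *theta)
    chain = []
    for _ in range(n_iter):
        prop = [t + rng.gauss(0.0, step) for t in theta]
        new = loglik(data, *prop)
        if rng.random() < math.exp(min(0.0, new - cur)):
            theta, cur = prop, new
        chain.append(tuple(theta))
    return chain
```

A single-chain, isotropic proposal like this needs careful tuning; the t-walk avoids that tuning, which is one reason the paper adopts it.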

## 5. Simulation Studies

For the ML method, we used the `glogisfit` function of the `glogis` package [36]. We also tried the `mipfp` package of `R`, without achieving reliable results; we do not show these results in the paper due to space restrictions. Table 2 reports the estimates obtained with the moment method, based on 1000 replicates, for the sample sizes, cases, parameters, and indicators mentioned. These results are not satisfactory, and we do not report results for the ML method because they are totally unsatisfactory.
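Replicating this kind of study requires simulating from the Type I generalized logistic distribution, which is straightforward by inverting its CDF, $F(x)=[1+e^{-(x-\mu)/\sigma}]^{-b}$. A Python sketch under that standard parameterization (the function name is ours):

```python
import math, random

def tgl_sample(n, mu, sigma, b, seed=None):
    # Inverse-transform sampling: solving F(x) = u for x gives
    # x = mu - sigma * log(u**(-1/b) - 1), with u ~ Uniform(0, 1)
    rng = random.Random(seed)
    return [mu - sigma * math.log(rng.random() ** (-1.0 / b) - 1.0)
            for _ in range(n)]
```

Because $F(\mu)=2^{-b}$, only a fraction $2^{-b}$ of the simulated values fall below the location parameter, which makes the asymmetry induced by $b$ easy to check empirically.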

- Case 1: $\mathrm{E}(\mu_t \mid \mu^\star,\sigma^\star,b^\star)=0$, $\mathrm{Var}(\mu_t \mid \mu^\star,\sigma^\star,b^\star)=(\mu^\star+\sigma^\star/2+b^\star)^{1/2}$, $\mathrm{E}(\sigma_t \mid \mu^\star,\sigma^\star,b^\star)=\sigma^\star+b^\star$, $\mathrm{Var}(\sigma_t \mid \mu^\star,\sigma^\star,b^\star)=(\mu^\star)^2+3\sigma^\star$, $\mathrm{E}(b_t \mid \mu^\star,\sigma^\star,b^\star)=(\mu^\star)^2/5+b^\star$, $\mathrm{Var}(b_t \mid \mu^\star,\sigma^\star,b^\star)=\sigma^\star$.
- Case 2: $\mathrm{E}(\mu_t \mid \mu^\star,\sigma^\star,b^\star)=0$, $\mathrm{Var}(\mu_t \mid \mu^\star,\sigma^\star,b^\star)=(\mu^\star)^2+\sigma^\star/4+b^\star/10$, $\mathrm{E}(\sigma_t \mid \mu^\star,\sigma^\star,b^\star)=\sigma^\star$, $\mathrm{Var}(\sigma_t \mid \mu^\star,\sigma^\star,b^\star)=(\mu^\star)^2+\sigma^\star+b^\star$, $\mathrm{E}(b_t \mid \mu^\star,\sigma^\star,b^\star)=b^\star$, $\mathrm{Var}(b_t \mid \mu^\star,\sigma^\star,b^\star)=(\mu^\star)^2+\sigma^\star$.
- Case 3: $\mathrm{E}(\mu_t \mid \mu^\star,\sigma^\star,b^\star)=\mu^\star$, $\mathrm{Var}(\mu_t \mid \mu^\star,\sigma^\star,b^\star)=(\sigma^\star+b^\star)^2$, $\mathrm{E}(\sigma_t \mid \mu^\star,\sigma^\star,b^\star)=\sigma^\star$, $\mathrm{Var}(\sigma_t \mid \mu^\star,\sigma^\star,b^\star)=(\mu^\star/2)^2$.
- Case 4: $\mathrm{E}(\mu_t \mid \mu^\star,\sigma^\star,b^\star)=0$, $\mathrm{Var}(\mu_t \mid \mu^\star,\sigma^\star,b^\star)=(\mu^\star)^2+\sigma^\star/4+b^\star/10$, $\mathrm{E}(\sigma_t \mid \mu^\star,\sigma^\star,b^\star)=(\mu^\star)^2+\sigma^\star$, $\mathrm{Var}(\sigma_t \mid \mu^\star,\sigma^\star,b^\star)=(\mu^\star)^2+\sigma^\star/6+b^\star/10$, $\mathrm{E}(b_t \mid \mu^\star,\sigma^\star,b^\star)=9+(\mu^\star)^2+b^\star/10$, $\mathrm{Var}(b_t \mid \mu^\star,\sigma^\star,b^\star)=(\mu^\star)^2+\sigma^\star/2$.

## 6. Empirical Application

## 7. Conclusions

## Author Contributions

## Funding

## Data Availability Statement

## Acknowledgments

## Conflicts of Interest

## References

1. Johnson, N.L.; Kotz, S.; Balakrishnan, N. Continuous Univariate Distributions; Wiley: New York, NY, USA, 1994; Volume 1–2.
2. Balakrishnan, N.; Nevzorov, V.B. A Primer on Statistical Distributions; Wiley: New York, NY, USA, 2004.
3. Balakrishnan, N. Handbook of the Logistic Distribution; CRC Press: Boca Raton, FL, USA, 1991.
4. Lai, C.D. Generalized Weibull distributions. In Generalized Weibull Distributions; Springer: New York, NY, USA, 2014; pp. 23–75.
5. Dubey, S.D. A new derivation of the logistic distribution. Nav. Res. Logist. Q. **1969**, 16, 37–40.
6. Al-Marzouki, S.; Jamal, F.; Chesneau, C.; Elgarhy, M. Half logistic inverse Lomax distribution with applications. Symmetry **2021**, 13, 309.
7. Athayde, E.; Azevedo, A.; Barros, M.; Leiva, V. Failure rate of Birnbaum-Saunders distributions: Shape, change-point, estimation and robustness. Braz. J. Probab. Stat. **2019**, 3, 301–328.
8. Afify, A.Z.; Altun, E.; Alizadeh, M.; Ozel, G.; Hamedani, G.G. The odd exponentiated half-logistic-G family: Properties, characterizations and applications. Chil. J. Stat. **2017**, 8, 65–91.
9. Balakrishnan, N.; Hossain, A. Inference for the Type II generalized logistic distribution under progressive Type II censoring. J. Stat. Comput. Simul. **2007**, 77, 1013–1031.
10. Zelterman, D. Parameter estimation in the generalized logistic distribution. Comput. Stat. Data Anal. **1987**, 5, 177–184.
11. Sreekumar, N.; Thomas, P.Y. Estimation of the parameters of Type-I generalized logistic distribution using order statistics. Commun. Stat. Theory Methods **2008**, 37, 1506–1524.
12. Batchelor, R.A.; Orr, A.B. Inflation expectations revisited. Economica **1988**, 55, 317–331.
13. Tolikas, K.; Koulakiotis, A.; Brown, R.A. Extreme risk and value-at-risk in the German stock market. Eur. J. Financ. **2007**, 13, 373–395.
14. Walter, N.; Bergheim, S. Productivity, Growth Potential and Monetary Policy in EMU; Technical Report 42, Reports on European Integration; Publications Office of the European Union: Luxembourg, 2006.
15. Hossain, A.; Willan, A.R. Approximate MLEs of the parameters of location-scale models under type II censoring. Statistics **2007**, 41, 385–394.
16. Lagos-Álvarez, B.; Ferreira, G.; Porcu, E. Modified maximum likelihood estimation in autoregressive processes with generalized exponential innovations. Open J. Stat. **2014**, 4, 620.
17. Hossain, A.; Beyene, J.; Willan, A.R.; Hu, P. A flexible approximate likelihood ratio test for detecting differential expression in microarray data. Comput. Stat. Data Anal. **2009**, 53, 3685–3695.
18. Zeileis, A.; Windberger, T. glogis: Fitting and Testing Generalized Logistic Distributions. R package version 1.0-1, 2018. Available online: https://CRAN.R-project.org/package=glogis (accessed on 15 January 2022).
19. Abberger, K. ML-Estimation in the Location-Scale-Shape Model of the Generalized Logistic Distribution; Discussion Paper Series/CoFE, Vol. 02/15; Konstanz Universitat: Konstanz, Germany, 2002.
20. Swain, J.J.; Venkatraman, S.; Wilson, J.R. Least-squares estimation of distribution functions in Johnson's translation system. J. Stat. Comput. Simul. **1988**, 29, 271–297.
21. Dey, S.; Alzaatreh, A.; Ghosh, I. Parameter estimation methods for the Weibull-Pareto distribution. Comput. Math. Methods **2021**, 3, e1053.
22. Greenwood, J.A.; Landwehr, J.M.; Matalas, N.C.; Wallis, J.R. Probability weighted moments: Definition and relation to parameters of several distributions expressable in inverse form. Water Resour. Res. **1979**, 15, 1049–1054.
23. Hosking, J.R.M. L-moments: Analysis and estimation of distributions using linear combinations of order statistics. J. R. Stat. Soc. **1990**, 52, 105–124.
24. Lillo, C.; Leiva, V.; Nicolis, O.; Aykroyd, R.G. L-moments of the Birnbaum-Saunders distribution and its extreme value version: Estimation, goodness of fit and application to earthquake data. J. Appl. Stat. **2018**, 45, 187–209.
25. Hosking, J.R.M. lmom: L-Moments. R package version 2.8, 2019. Available online: https://CRAN.R-project.org/package=lmom (accessed on 15 January 2022).
26. Kotz, S.; Leiva, V.; Sanhueza, A. Two new mixture models related to the inverse Gaussian distribution. Methodol. Comput. Appl. Probab. **2010**, 12, 199–212.
27. Balakrishnan, N.; Gupta, R.; Kundu, D.; Leiva, V.; Sanhueza, A. On some mixture models based on the Birnbaum-Saunders distribution and associated inference. J. Stat. Plan. Inference **2011**, 141, 2175–2190.
28. Dempster, A.P.; Laird, N.M.; Rubin, D.B. Maximum likelihood from incomplete data via the EM algorithm. J. R. Stat. Soc. B **1977**, 39, 1–22.
29. Celeux, G.; Govaert, G. A classification EM algorithm for clustering and two stochastic versions. Comput. Stat. Data Anal. **1992**, 14, 315–332.
30. Celeux, G.; Didier, C.; Diebolt, J. Stochastic versions of the EM algorithm: An experimental study in the mixture case. J. Stat. Comput. Simul. **1996**, 55, 287–314.
31. Raqab, M.Z.; Madi, M.T. Bayesian inference for the generalized exponential distribution. J. Stat. Comput. Simul. **2005**, 75, 841–852.
32. Balakrishnan, N.; Leung, M. Order statistics from the Type I generalized logistic distribution. Commun. Stat. Simul. Comput. **1988**, 17, 25–50.
33. Nassar, M.; Elmasry, A. A study of generalized logistic distributions. J. Egypt. Math. Soc. **2012**, 20, 126–133.
34. Bernardo, J.M.; Smith, A.F. Bayesian Theory; Wiley: New York, NY, USA, 2009.
35. Christen, J.A.; Fox, C. A general purpose sampling algorithm for continuous distributions (the t-walk). Bayesian Anal. **2010**, 5, 263–281.
36. Windberger, T.; Zeileis, A. Structural breaks in inflation dynamics within the European Monetary Union. East. Eur. Econ. **2014**, 52, 66–88.
37. Harris, R.; Kanji, G. On the use of minimum chi-square estimation. J. R. Stat. Soc. D **1983**, 32, 379–394.
38. Barthélemy, J.; Suesse, T. mipfp: An R package for multidimensional array fitting and simulating multivariate Bernoulli distributions. J. Stat. Softw. **2018**, 86, 1–20.
39. Lindley, D. Reconciliation of probability distributions. Oper. Res. **1983**, 31, 866–880.
40. Walters, C.; Ludwig, D. Calculation of Bayes posterior probability distributions for key population parameters. Can. J. Fish. Aquat. Sci. **1994**, 51, 713–722.
41. Gelman, A.; Rubin, D.B. A single series from the Gibbs sampler provides a false sense of security. Bayesian Stat. **1992**, 4, 625–631.
42. Jergensen, G.V. Copper leaching, solvent extraction, and electrowinning technology. Int. J. Surf. Min. Reclam. Environ. **1999**, 13.
43. Lagos-Álvarez, B.; Jiménez-Gamero, M.; Fernández, A. Bias correction in the type I generalized logistic distribution. Commun. Stat. Simul. Comput. **2011**, 40, 511–531.
44. Couri, L.; Ospina, R.; da Silva, G.; Leiva, V.; Figueroa-Zuniga, J. A study on computational algorithms in the estimation of parameters for a class of beta regression models. Mathematics **2022**, 10, 299.
45. Costa, E.; Santos-Neto, M.; Leiva, V. Optimal sample size for the Birnbaum-Saunders distribution under decision theory with symmetric and asymmetric loss functions. Symmetry **2021**, 13, 926.
46. Saulo, H.; Dasilva, A.; Leiva, V.; Sanchez, L.; de la Fuente-Mella, H. Log-symmetric quantile regression models. Stat. Neerl. **2022**; in press.
47. Liu, Y.; Mao, G.; Leiva, V.; Liu, S.; Tapia, A. Diagnostic analytics for an autoregressive model under the skew-normal distribution. Mathematics **2020**, 8, 693.
48. Martinez, S.; Giraldo, R.; Leiva, V. Birnbaum-Saunders functional regression models for spatial data. Stoch. Environ. Res. Risk Assess. **2019**, 33, 1765–1780.
49. Huerta, M.; Leiva, V.; Liu, S.; Rodriguez, M.; Villegas, D. On a partial least squares regression model for asymmetric data with a chemical application in mining. Chemom. Intell. Lab. Syst. **2019**, 190, 55–68.
50. Figueroa-Zuniga, J.; Bayes, C.L.; Leiva, V.; Liu, S. Robust beta regression modeling with errors-in-variables: A Bayesian approach and numerical applications. Stat. Pap. **2022**; in press.

**Figure 1.** PDF of the IGL distribution (**a**) and $\mathrm{d}\,\mathrm{Skew}_{M}(\alpha,\beta)/\mathrm{d}\alpha$ of the IVGL distribution (**b**) for the indicated values of the parameters.

**Figure 3.** Behavior of the Jeffreys priors, here denoted as $z$, for $(\sigma,b)\in$: $(0,0.5)\times(0,0.5)$ (**a**); $(0,0.5)\times(4,6)$ (**b**); $(4,6)\times(0,0.5)$ (**c**); and $(5,7)\times(9,11)$ (**d**).

**Figure 4.** Histogram and fitted PDF (**a**), and empirical distribution function with fitted cumulative distribution function (**b**), with parameters estimated via classical and Bayesian methods using PLS daily flow data.

**Figure 5.** ACF of the chains $\mu \mid W_n$ (**a**), $\sigma \mid W_n$ (**b**), and $b \mid W_n$ (**c**) with PLS daily flow data.

| $\mu$ | $\sigma$ | $b$ | $n=15$ | $n=30$ | $n=50$ | $n=100$ |
|---|---|---|---|---|---|---|
| 0 | 2 | 0.05 | 93.5% | 84.5% | 83.0% | 77.0% |
| 0 | 4 | 0.1 | 94.5% | 85.5% | 84.5% | 80.1% |
| 0 | 1 | 1 | 98.0% | 98.5% | 99.5% | 100.0% |
| 0 | 6 | 10 | 88.5% | 80.5% | 82.0% | 70.5% |

**Table 2.** Estimates with the moment method for 1000 replicates of the listed sample size, case, parameter, and indicator.

| $n=15$, Cases 1–2 | $\mu=0$ | $\sigma=2$ | $b=0.05$ | $\mu=0$ | $\sigma=4$ | $b=0.1$ |
|---|---|---|---|---|---|---|
| Mean | −19.09638 | 13.31917 | 0.49371 | −18.81575 | 13.75909 | 0.51013 |
| Bias | −19.09638 | 11.31917 | 0.44371 | −18.81575 | 9.75909 | 0.41013 |
| Relative bias | - | 5.65958 | 8.8742 | - | 2.43977 | 4.1013 |
| Standard deviation | 14.25511 | 4.4811 | 0.21426 | 14.80999 | 4.53213 | 0.22428 |
| MSE | 567.87987 | 148.20384 | 0.24278 | 573.36827 | 115.78 | 0.21851 |

| $n=15$, Cases 3–4 | $\mu=0$ | $\sigma=1$ | $b=1$ | $\mu=0$ | $\sigma=6$ | $b=10$ |
|---|---|---|---|---|---|---|
| Mean | −0.2572 | 0.98217 | 1.39323 | 11.42986 | 4.90052 | 2.78161 |
| Bias | −0.2572 | −0.01783 | 0.39323 | 11.42986 | −1.09948 | −7.21839 |
| Relative bias | - | −0.01783 | 0.39323 | - | −0.18325 | −0.72184 |
| Standard deviation | 1.0315 | 0.26367 | 1.42171 | 5.86088 | 1.39999 | 3.27409 |
| MSE | 1.13014 | 0.06984 | 2.17588 | 164.99163 | 3.16883 | 62.82485 |

| $n=30$, Cases 1–2 | $\mu=0$ | $\sigma=2$ | $b=0.05$ | $\mu=0$ | $\sigma=4$ | $b=0.1$ |
|---|---|---|---|---|---|---|
| Mean | −14.43447 | 11.63792 | 0.38711 | −14.29753 | 12.14067 | 0.4039 |
| Bias | −14.43447 | 9.63792 | 0.33711 | −14.29753 | 8.14067 | 0.3039 |
| Relative bias | - | 4.81896 | 6.7422 | - | 2.03517 | 3.039 |
| Standard deviation | 9.2901 | 3.64442 | 0.14531 | 9.57106 | 3.64783 | 0.15087 |
| MSE | 294.6599 | 106.17127 | 0.13476 | 296.0245 | 79.57715 | 0.11512 |

| $n=30$, Cases 3–4 | $\mu=0$ | $\sigma=1$ | $b=1$ | $\mu=0$ | $\sigma=6$ | $b=10$ |
|---|---|---|---|---|---|---|
| Mean | −0.1717 | 0.97755 | 1.468 | 11.33066 | 4.89198 | 3.03369 |
| Bias | −0.1717 | −0.02245 | 0.468 | 11.33066 | −1.10802 | −6.96631 |
| Relative bias | - | −0.02245 | 0.468 | - | −0.18467 | −0.69663 |
| Standard deviation | 1.04284 | 0.2548 | 3.32295 | 5.87308 | 1.14218 | 4.2669 |
| MSE | 1.117 | 0.06543 | 11.26101 | 162.87692 | 2.53227 | 66.73593 |

| $n=50$, Cases 1–2 | $\mu=0$ | $\sigma=2$ | $b=0.05$ | $\mu=0$ | $\sigma=4$ | $b=0.1$ |
|---|---|---|---|---|---|---|
| Mean | −10.79721 | 10.10306 | 0.31954 | −10.5922 | 10.65166 | 0.33576 |
| Bias | −10.79721 | 8.10306 | 0.26954 | −10.5922 | 6.65166 | 0.23576 |
| Relative bias | - | 4.05153 | 5.3908 | - | 1.66291 | 2.3576 |
| Standard deviation | 8.38061 | 3.34231 | 0.13118 | 8.62776 | 3.34787 | 0.13503 |
| MSE | 186.81438 | 76.83061 | 0.08986 | 186.63291 | 55.45283 | 0.07382 |

| $n=50$, Cases 3–4 | $\mu=0$ | $\sigma=1$ | $b=1$ | $\mu=0$ | $\sigma=6$ | $b=10$ |
|---|---|---|---|---|---|---|
| Mean | −0.12718 | 0.97281 | 1.30513 | 9.72513 | 5.09282 | 3.83445 |
| Bias | −0.12718 | −0.02719 | 0.30513 | 9.72513 | −0.90718 | −6.16555 |
| Relative bias | - | −0.02719 | 0.30513 | - | −0.1512 | −0.61656 |
| Standard deviation | 0.92368 | 0.21061 | 1.61055 | 5.55714 | 0.89339 | 5.36295 |
| MSE | 0.86936 | 0.04509 | 2.68698 | 125.45997 | 1.62112 | 66.77523 |

| $n=100$, Cases 1–2 | $\mu=0$ | $\sigma=2$ | $b=0.05$ | $\mu=0$ | $\sigma=4$ | $b=0.1$ |
|---|---|---|---|---|---|---|
| Mean | −8.57027 | 9.05183 | 0.272 | −8.21102 | 9.56039 | 0.28589 |
| Bias | −8.57027 | 7.05183 | 0.222 | −8.21102 | 5.56039 | 0.18589 |
| Relative bias | - | 3.52591 | 4.44 | - | 1.3901 | 1.8589 |
| Standard deviation | 5.91348 | 2.60411 | 0.09776 | 6.22217 | 2.70276 | 0.10234 |
| MSE | 108.41877 | 56.50969 | 0.05884 | 106.13622 | 38.22284 | 0.04503 |

| $n=100$, Cases 3–4 | $\mu=0$ | $\sigma=1$ | $b=1$ | $\mu=0$ | $\sigma=6$ | $b=10$ |
|---|---|---|---|---|---|---|
| Mean | −0.19489 | 1.00876 | 1.23021 | 7.13316 | 5.4028 | 5.68398 |
| Bias | −0.19489 | 0.00876 | 0.23021 | 7.13316 | −0.5972 | −4.31602 |
| Relative bias | - | 0.00876 | 0.23021 | - | −0.09953 | −0.4316 |
| Standard deviation | 0.67739 | 0.15986 | 0.68861 | 5.93529 | 0.75779 | 7.43532 |
| MSE | 0.49684 | 0.02563 | 0.52718 | 86.10964 | 0.93089 | 73.91197 |

**Table 3.** Bayesian estimate (posterior mean) for 1000 replicates of the listed sample size, case, parameter, and indicator.

| $n=15$, Cases 1–2 | $\mu=0$ | $\sigma=2$ | $b=0.05$ | $\mu=0$ | $\sigma=4$ | $b=0.1$ |
|---|---|---|---|---|---|---|
| Mean | −0.019 | 3.102 | 0.081 | 0.061 | 3.711 | 0.093 |
| Bias | −0.019 | 1.102 | 0.031 | 0.061 | −0.289 | −0.007 |
| Relative bias | - | 0.551 | 0.62 | - | −0.072 | −0.07 |
| Standard deviation | 0.08 | 0.632 | 0.024 | 0.406 | 0.157 | 0.024 |
| MSE | 0.007 | 1.614 | 0.002 | 0.169 | 0.108 | 0.001 |

| $n=15$, Cases 3–4 | $\mu=0$ | $\sigma=1$ | $b=1$ | $\mu=0$ | $\sigma=6$ | $b=10$ |
|---|---|---|---|---|---|---|
| Mean | 0.086 | 0.962 | 1.363 | 0.084 | 5.918 | 10.517 |
| Bias | 0.086 | −0.038 | 0.363 | 0.084 | −0.082 | 0.517 |
| Relative bias | - | −0.038 | 0.363 | - | −0.014 | 0.052 |
| Standard deviation | 0.85 | 0.331 | 0.683 | 0.172 | 0.746 | 0.742 |
| MSE | 0.729 | 0.111 | 0.598 | 0.037 | 0.563 | 0.818 |

| $n=30$, Cases 1–2 | $\mu=0$ | $\sigma=2$ | $b=0.05$ | $\mu=0$ | $\sigma=4$ | $b=0.1$ |
|---|---|---|---|---|---|---|
| Mean | −0.025 | 2.737 | 0.07 | 0.062 | 3.705 | 0.093 |
| Bias | −0.025 | 0.737 | 0.02 | 0.062 | −0.295 | −0.007 |
| Relative bias | - | 0.368 | 0.4 | - | −0.074 | −0.07 |
| Standard deviation | 0.122 | 0.719 | 0.02 | 0.558 | 0.221 | 0.018 |
| MSE | 0.015 | 1.061 | 0.001 | 0.315 | 0.136 | 0 |

| $n=30$, Cases 3–4 | $\mu=0$ | $\sigma=1$ | $b=1$ | $\mu=0$ | $\sigma=6$ | $b=10$ |
|---|---|---|---|---|---|---|
| Mean | −0.097 | 0.996 | 1.419 | 0.088 | 5.956 | 10.429 |
| Bias | −0.097 | −0.004 | 0.419 | 0.088 | −0.044 | 0.429 |
| Relative bias | - | −0.004 | 0.419 | - | −0.007 | 0.043 |
| Standard deviation | 0.836 | 0.267 | 0.761 | 0.21 | 0.581 | 0.949 |
| MSE | 0.709 | 0.072 | 0.755 | 0.052 | 0.339 | 1.085 |

| $n=50$, Cases 1–2 | $\mu=0$ | $\sigma=2$ | $b=0.05$ | $\mu=0$ | $\sigma=4$ | $b=0.1$ |
|---|---|---|---|---|---|---|
| Mean | −0.007 | 2.502 | 0.064 | 0.086 | 3.71 | 0.093 |
| Bias | −0.007 | 0.502 | 0.014 | 0.086 | −0.29 | −0.007 |
| Relative bias | - | 0.251 | 0.28 | - | −0.072 | −0.07 |
| Standard deviation | 0.151 | 0.666 | 0.018 | 0.634 | 0.241 | 0.014 |
| MSE | 0.023 | 0.696 | 0.001 | 0.409 | 0.142 | 0 |

| $n=50$, Cases 3–4 | $\mu=0$ | $\sigma=1$ | $b=1$ | $\mu=0$ | $\sigma=6$ | $b=10$ |
|---|---|---|---|---|---|---|
| Mean | −0.186 | 1.017 | 1.404 | 0.078 | 5.961 | 10.379 |
| Bias | −0.186 | 0.017 | 0.404 | 0.078 | −0.039 | 0.379 |
| Relative bias | - | 0.017 | 0.404 | - | −0.006 | 0.038 |
| Standard deviation | 0.754 | 0.196 | 0.754 | 0.238 | 0.47 | 1.063 |
| MSE | 0.604 | 0.039 | 0.731 | 0.063 | 0.223 | 1.273 |

| $n=100$, Cases 1–2 | $\mu=0$ | $\sigma=2$ | $b=0.05$ | $\mu=0$ | $\sigma=4$ | $b=0.1$ |
|---|---|---|---|---|---|---|
| Mean | −0.012 | 2.383 | 0.061 | 0.05 | 3.779 | 0.096 |
| Bias | −0.012 | 0.383 | 0.011 | 0.05 | −0.221 | −0.004 |
| Relative bias | - | 0.192 | 0.22 | - | −0.055 | −0.040 |
| Standard deviation | 0.194 | 0.572 | 0.016 | 0.707 | 0.272 | 0.011 |
| MSE | 0.038 | 0.474 | 0 | 0.502 | 0.123 | 0 |

| $n=100$, Cases 3–4 | $\mu=0$ | $\sigma=1$ | $b=1$ | $\mu=0$ | $\sigma=6$ | $b=10$ |
|---|---|---|---|---|---|---|
| Mean | −0.221 | 1.027 | 1.314 | 0.067 | 5.96 | 10.389 |
| Bias | −0.221 | 0.027 | 0.314 | 0.067 | −0.04 | 0.389 |
| Relative bias | - | 0.027 | 0.314 | - | −0.007 | 0.039 |
| Standard deviation | 0.561 | 0.151 | 0.531 | 0.187 | 0.357 | 0.879 |
| MSE | 0.364 | 0.023 | 0.38 | 0.039 | 0.129 | 0.924 |

**Table 4.** Bayesian estimate (posterior median) for 1000 replicates of the listed sample size, case, parameter, and indicator.

| $n=15$, Cases 1–2 | $\mu=0$ | $\sigma=2$ | $b=0.05$ | $\mu=0$ | $\sigma=4$ | $b=0.1$ |
|---|---|---|---|---|---|---|
| Mean | −0.026 | 2.747 | 0.069 | 0.065 | 3.638 | 0.087 |
| Bias | −0.026 | 0.747 | 0.019 | 0.065 | −0.362 | −0.013 |
| Relative bias | - | 0.374 | 0.38 | - | −0.09 | −0.13 |
| Standard deviation | 0.095 | 0.624 | 0.023 | 0.398 | 0.158 | 0.022 |
| MSE | 0.01 | 0.947 | 0.001 | 0.163 | 0.156 | 0.001 |

| $n=15$, Cases 3–4 | $\mu=0$ | $\sigma=1$ | $b=1$ | $\mu=0$ | $\sigma=6$ | $b=10$ |
|---|---|---|---|---|---|---|
| Mean | 0.22 | 0.907 | 0.935 | 0.081 | 5.84 | 10.267 |
| Bias | 0.22 | −0.093 | −0.065 | 0.081 | −0.16 | 0.267 |
| Relative bias | - | −0.093 | −0.065 | - | −0.027 | 0.027 |
| Standard deviation | 0.924 | 0.346 | 0.513 | 0.191 | 0.724 | 0.728 |
| MSE | 0.903 | 0.128 | 0.267 | 0.043 | 0.549 | 0.602 |

| $n=30$, Cases 1–2 | $\mu=0$ | $\sigma=2$ | $b=0.05$ | $\mu=0$ | $\sigma=4$ | $b=0.1$ |
|---|---|---|---|---|---|---|
| Mean | −0.034 | 2.472 | 0.062 | 0.064 | 3.632 | 0.089 |
| Bias | −0.034 | 0.472 | 0.012 | 0.064 | −0.368 | −0.011 |
| Relative bias | - | 0.236 | 0.24 | - | −0.092 | −0.11 |
| Standard deviation | 0.141 | 0.717 | 0.02 | 0.543 | 0.22 | 0.017 |
| MSE | 0.021 | 0.737 | 0.001 | 0.299 | 0.184 | 0 |

| $n=30$, Cases 3–4 | $\mu=0$ | $\sigma=1$ | $b=1$ | $\mu=0$ | $\sigma=6$ | $b=10$ |
|---|---|---|---|---|---|---|
| Mean | 0.017 | 0.966 | 1.079 | 0.092 | 5.903 | 10.214 |
| Bias | 0.017 | −0.034 | 0.079 | 0.092 | −0.097 | 0.214 |
| Relative bias | - | −0.034 | 0.079 | - | −0.016 | 0.021 |
| Standard deviation | 0.845 | 0.27 | 0.571 | 0.245 | 0.568 | 0.935 |
| MSE | 0.714 | 0.074 | 0.332 | 0.068 | 0.332 | 0.92 |

| $n=50$, Cases 1–2 | $\mu=0$ | $\sigma=2$ | $b=0.05$ | $\mu=0$ | $\sigma=4$ | $b=0.1$ |
|---|---|---|---|---|---|---|
| Mean | −0.014 | 2.304 | 0.058 | 0.09 | 3.644 | 0.09 |
| Bias | −0.014 | 0.304 | 0.008 | 0.09 | −0.356 | −0.01 |
| Relative bias | - | 0.152 | 0.16 | - | −0.089 | −0.1 |
| Standard deviation | 0.18 | 0.672 | 0.018 | 0.624 | 0.239 | 0.014 |
| MSE | 0.033 | 0.545 | 0 | 0.398 | 0.184 | 0 |

| $n=50$, Cases 3–4 | $\mu=0$ | $\sigma=1$ | $b=1$ | $\mu=0$ | $\sigma=6$ | $b=10$ |
|---|---|---|---|---|---|---|
| Mean | −0.089 | 0.996 | 1.142 | 0.085 | 5.922 | 10.185 |
| Bias | −0.089 | −0.004 | 0.142 | 0.085 | −0.078 | 0.185 |
| Relative bias | - | −0.004 | 0.142 | - | −0.013 | 0.018 |
| Standard deviation | 0.751 | 0.197 | 0.57 | 0.282 | 0.46 | 1.045 |
| MSE | 0.572 | 0.039 | 0.346 | 0.087 | 0.217 | 1.126 |

| $n=100$, Cases 1–2 | $\mu=0$ | $\sigma=2$ | $b=0.05$ | $\mu=0$ | $\sigma=4$ | $b=0.1$ |
|---|---|---|---|---|---|---|
| Mean | −0.022 | 2.258 | 0.057 | 0.064 | 3.72 | 0.093 |
| Bias | −0.022 | 0.258 | 0.007 | 0.064 | −0.28 | −0.007 |
| Relative bias | - | 0.129 | 0.14 | - | −0.07 | −0.07 |
| Standard deviation | 0.231 | 0.582 | 0.016 | 0.693 | 0.272 | 0.011 |
| MSE | 0.054 | 0.406 | 0 | 0.485 | 0.152 | 0 |

| $n=100$, Cases 3–4 | $\mu=0$ | $\sigma=1$ | $b=1$ | $\mu=0$ | $\sigma=6$ | $b=10$ |
|---|---|---|---|---|---|---|
| Mean | −0.154 | 1.015 | 1.16 | 0.076 | 5.938 | 10.218 |
| Bias | −0.154 | 0.015 | 0.16 | 0.076 | −0.062 | 0.218 |
| Relative bias | - | 0.015 | 0.16 | - | −0.01 | 0.022 |
| Standard deviation | 0.554 | 0.152 | 0.43 | 0.223 | 0.349 | 0.856 |
| MSE | 0.331 | 0.023 | 0.211 | 0.055 | 0.125 | 0.781 |

**Table 5.** Results of the diagnostics for 1000 replicates of the listed sample size, case, parameter, and test.

| $n=15$, Cases 1–2 | $\mu=0$ | $\sigma=2$ | $b=0.05$ | $\mu=0$ | $\sigma=4$ | $b=0.1$ |
|---|---|---|---|---|---|---|
| Gelman-Rubin | - | 1.2081 | - | - | 1.0644 | - |
| Geweke | 0.9516 | 0.648 | 0.6535 | 0.6399 | 0.3411 | 0.1022 |
| Ljung-Box | 0.112 | 0.2358 | 0.2497 | 0.0729 | 0.2563 | 0.1289 |

| $n=15$, Cases 3–4 | $\mu=0$ | $\sigma=1$ | $b=1$ | $\mu=0$ | $\sigma=6$ | $b=10$ |
|---|---|---|---|---|---|---|
| Gelman-Rubin | - | 1.1755 | - | - | 1.0706 | - |
| Geweke | 0.2091 | 0.438 | 0.3774 | 0.7777 | 0.5109 | 0.7443 |
| Ljung-Box | 0.525 | 0.4431 | 0.2327 | 0.6502 | 0.1333 | 0.1462 |

| $n=30$, Cases 1–2 | $\mu=0$ | $\sigma=2$ | $b=0.05$ | $\mu=0$ | $\sigma=4$ | $b=0.1$ |
|---|---|---|---|---|---|---|
| Gelman-Rubin | - | 1.017 | - | - | 1.0218 | - |
| Geweke | 0.2203 | 0.0943 | 0.1702 | 0.9785 | 0.9788 | 0.427 |
| Ljung-Box | 0.6654 | 0.0271 | 0.0022 | 0.737 | 0.4986 | 0.7545 |

| $n=30$, Cases 3–4 | $\mu=0$ | $\sigma=1$ | $b=1$ | $\mu=0$ | $\sigma=6$ | $b=10$ |
|---|---|---|---|---|---|---|
| Gelman-Rubin | - | 1.0758 | - | - | 1.0414 | - |
| Geweke | 0.5641 | 0.2097 | 0.4419 | 0.722 | 0.5419 | 0.7722 |
| Ljung-Box | 0.2461 | 0.8968 | 0.193 | 0.0779 | 0.0902 | 0.3097 |

| $n=50$, Cases 1–2 | $\mu=0$ | $\sigma=2$ | $b=0.05$ | $\mu=0$ | $\sigma=4$ | $b=0.1$ |
|---|---|---|---|---|---|---|
| Gelman-Rubin | - | 1.0209 | - | - | 1.0305 | - |
| Geweke | 0.8796 | 0.542 | 0.6853 | 0.5699 | 0.2915 | 0.5914 |
| Ljung-Box | 0.0822 | 0.4869 | 0.2854 | 0.2587 | 0.7124 | 0.6139 |

| $n=50$, Cases 3–4 | $\mu=0$ | $\sigma=1$ | $b=1$ | $\mu=0$ | $\sigma=6$ | $b=10$ |
|---|---|---|---|---|---|---|
| Gelman-Rubin | - | 1.1348 | - | - | 1.1341 | - |
| Geweke | 0.5635 | 0.4084 | 0.5817 | 0.9604 | 0.7169 | 0.6362 |
| Ljung-Box | 0.1675 | 0.6055 | 0.0738 | 0.4626 | 0.2712 | 0.0961 |

| $n=100$, Cases 1–2 | $\mu=0$ | $\sigma=2$ | $b=0.05$ | $\mu=0$ | $\sigma=4$ | $b=0.1$ |
|---|---|---|---|---|---|---|
| Gelman-Rubin | - | 1.1659 | - | - | 1.0252 | - |
| Geweke | 0.6261 | 0.5756 | 0.8681 | 0.8935 | 0.6845 | 0.9877 |
| Ljung-Box | 0.0754 | 0.1719 | 0.0878 | 0.0942 | 0.2024 | 0.5346 |

| $n=100$, Cases 3–4 | $\mu=0$ | $\sigma=1$ | $b=1$ | $\mu=0$ | $\sigma=6$ | $b=10$ |
|---|---|---|---|---|---|---|
| Gelman-Rubin | - | 1.0038 | - | - | 1.1379 | - |
| Geweke | 0.4396 | 0.1397 | 0.4124 | 0.0997 | 0.8876 | 0.5265 |
| Ljung-Box | 0.7793 | 0.1469 | 0.3915 | 0.1579 | 0.2634 | 0.1191 |

| Mean | Median | Variance | Standard Deviation | Coefficient of Skewness |
|---|---|---|---|---|
| 10,739.57 | 10,999.74 | 318,485.03 | 564.34 | −3.50 |

| Parameter | $\mu$ | $\sigma$ | $b$ |
|---|---|---|---|
| p-value | 0.60 | 0.34 | 0.04 |

**Table 8.** Classical and Bayesian estimates of $\theta=(\mu,\sigma,b)^{\top}$ with PLS daily flow data.

| Indicator | $\mu$ | $\sigma$ | $b$ |
|---|---|---|---|
| Estimate | 604.75 | 1184.63 | 3866.72 |
| Posterior mean | 11,000.27 | 177.55 | 0.49 |
| Posterior median | 11,000.24 | 177.55 | 0.49 |
| Variance | 1.18 | 4.15 | 0.00 |
| Standard deviation | 1.09 | 2.04 | 0.02 |



## Share and Cite


Lagos-Álvarez, B.; Jerez-Lillo, N.; Navarrete, J.P.; Figueroa-Zúñiga, J.; Leiva, V.
A Type I Generalized Logistic Distribution: Solving Its Estimation Problems with a Bayesian Approach and Numerical Applications Based on Simulated and Engineering Data. *Symmetry* **2022**, *14*, 655.
https://doi.org/10.3390/sym14040655
