Variational Bayesian Estimation of Quantile Nonlinear Dynamic Latent Variable Models with Possible Nonignorable Missingness
Abstract
1. Introduction
2. Model
2.1. Quantile Nonlinear DLVM
2.2. Mechanism of Missing Data
2.3. The Missing Covariates Distribution
3. Variational Bayesian Inference
Variational Bayes
4. Simulation Studies
5. A Real Example
6. Discussion
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
Appendix A. Calculation the Evidence Lower Bound (ELB)
References
- Diener, E.; Fujita, F.; Smith, H. The personality structure of affect. J. Personal. Soc. Psychol. 1995, 69, 130–141. [Google Scholar] [CrossRef]
- Chow, S.M.; Nesselroade, J.R.; Shifren, K.; Mcardle, J.J. Dynamic structure of emotions among individuals with parkinson’s disease. Struct. Equ. Model. Multidiscip. J. 2004, 11, 560–582. [Google Scholar] [CrossRef]
- Zhang, Z.Y.; Nesselroade, J.R. Bayesian Estimation of Categorical Dynamic Factor Models. Multivar. Behav. Res. 2007, 42, 729–756. [Google Scholar] [CrossRef]
- Chow, S.M.; Tang, N.S.; Yuan, Y.; Song, X.Y.; Zhu, H.T. Bayesian estimation of semiparametric dynamic latent variable models using the dirichlet process prior. Br. J. Math. Stat. Psychol. 2011, 64, 69–106. [Google Scholar] [CrossRef] [PubMed]
- Tang, N.S.; Chow, S.M.; Ibrahim, J.G.; Zhu, H.T. Bayesian sensitivity analysis of a nonlinear dynamic factor analysis model with nonparametric prior and possible nonignorable missingness. Psychometrika 2017, 82, 875–903. [Google Scholar] [CrossRef] [PubMed]
- Wang, Z.Q.; Tang, N.S. Bayesian quantile regression with mixed discrete and nonignorable missing covariates. Bayesian Anal. 2020, 15, 579–604. [Google Scholar] [CrossRef]
- Tuerde, M.; Tang, N.S. Bayesian semiparametric approach to quantile nonlinear dynamic factor analysis models with mixed ordered and nonignorable missing data. Statistics 2022, 56, 1166–1192. [Google Scholar] [CrossRef]
- Little, R.J.A.; Rubin, D.B. Statistical Analysis with Missing Data, 3rd ed.; John Wiley & Sons: New York, NY, USA, 2019. [Google Scholar]
- Bańbura, M.; Modugno, M. Maximum likelihood estimation of factor models on datasets with arbitrary pattern of missing data. J. Appl. Econ. 2014, 29, 133–160. [Google Scholar] [CrossRef]
- Jungbacker, B.; Koopman, S.J.; van der Wel, M. Maximum likelihood estimation for dynamic factor models with missing data. J. Econ. Dyn. Control 2011, 35, 1358–1368. [Google Scholar] [CrossRef]
- Stock, J.H.; Mark, W.W. Macroeconomic Forecasting Using Diffusion Indexes. J. Bus. Econ. Stat. 2002, 20, 147–295. [Google Scholar] [CrossRef]
- Stock, J.H.; Watson, M.W. Dynamic Factor Models, Factor-Augmented Vector Autoregressions, and Structural Vector Autoregressions in Macroeconomics. In Handbook of Macroeconomics; Elsevier B.V.: Amsterdam, The Netherlands, 2016; pp. 415–525. [Google Scholar]
- Kozumi, H.; Kobayashi, G. Gibbs sampling methods for Bayesian quantile regression. J. Stat. Comput. Simul. 2011, 81, 1565–1578. [Google Scholar] [CrossRef]
- Tang, A.M.; Tang, N.S. Semiparametric Bayesian inference on skew-normal joint modeling of multivariate longitudinal and survival data. Stat. Med. 2015, 34, 824–843. [Google Scholar] [CrossRef] [PubMed]
- Blei, D.; Jordan, M.I. Variational inference for Dirichlet process mixtures. Bayesian Analysis 2006, 1, 121–143. [Google Scholar] [CrossRef]
- Lee, S.Y.; Tang, N.S. Analysis of nonlinear structural equation models with nonignorable missing covariates and ordered categorical data. Stat. Sin. 2006, 16, 1117–1141. [Google Scholar]
- Beal, M.J. Variational Algorithms for Approximate Bayesian Inference. Ph.D. Thesis, University of London, London, UK, 2003. [Google Scholar]
- Bishop, C. Pattern Recognition and Machine Learning; Springer: New York, NY, USA, 2006. [Google Scholar]
- Blei, D.M.; Kucukelbir, A.; McAuliffe, J.D. Variational inference: A review for statisticians. J. Am. Stat. Assoc. 2017, 518, 859–877. [Google Scholar] [CrossRef]
- Durante, D.; Rigon, T. Conditionally Conjugate Mean-Field Variational Bayes for Logistic Models. Stat. Sci. 2019, 34, 472–485. [Google Scholar] [CrossRef]
- Zhu, H.T.; Ibrahim, J.G.; Tang, N.S. Bayesian influence analysis: A geometric approach. Biometrika 2011, 98, 307–323. [Google Scholar] [CrossRef] [PubMed]
Par | Case1 | Case2 | |||||||||||
Bias | RMS | Bias | RMS | Bias | RMS | Bias | RMS | Bias | RMS | Bias | RMS | ||
---|---|---|---|---|---|---|---|---|---|---|---|---|---|
0.016 | 0.033 | −0.008 | 0.024 | −0.019 | 0.031 | 0.007 | 0.032 | −0.028 | 0.043 | −0.057 | 0.070 | ||
0.016 | 0.032 | −0.008 | 0.024 | −0.020 | 0.031 | 0.004 | 0.031 | −0.029 | 0.045 | −0.060 | 0.075 | ||
0.017 | 0.031 | −0.009 | 0.028 | −0.020 | 0.031 | 0.002 | 0.032 | −0.029 | 0.044 | −0.060 | 0.075 | ||
0.015 | 0.031 | −0.008 | 0.026 | −0.019 | 0.031 | 0.007 | 0.033 | −0.029 | 0.045 | −0.062 | 0.076 | ||
0.019 | 0.037 | −0.007 | 0.024 | −0.018 | 0.030 | 0.011 | 0.030 | −0.030 | 0.044 | −0.064 | 0.078 | ||
0.021 | 0.036 | −0.008 | 0.028 | −0.017 | 0.031 | 0.007 | 0.030 | −0.031 | 0.044 | 0.062 | 0.077 | ||
0.042 | 0.113 | 0.004 | 0.112 | −0.029 | 0.123 | 0.109 | 0.154 | 0.101 | 0.152 | 0.077 | 0.157 | ||
0.047 | 0.100 | 0.007 | 0.093 | −0.027 | 0.102 | 0.095 | 0.131 | 0.062 | 0.111 | 0.029 | 0.112 | ||
0.048 | 0.097 | 0.001 | 0.087 | −0.030 | 0.102 | 0.097 | 0.132 | 0.061 | 0.110 | 0.033 | 0.112 | ||
0.046 | 0.099 | 0.002 | 0.094 | −0.031 | 0.101 | 0.093 | 0.129 | 0.066 | 0.110 | 0.027 | 0.116 | ||
0.034 | 0.116 | 0.007 | 0.104 | −0.030 | 0.123 | 0.116 | 0.157 | 0.105 | 0.151 | 0.095 | 0.163 | ||
0.040 | 0.102 | 0.002 | 0.099 | −0.036 | 0.093 | 0.098 | 0.131 | 0.068 | 0.109 | 0.048 | 0.109 | ||
0.042 | 0.104 | 0.005 | 0.099 | −0.033 | 0.095 | 0.100 | 0.133 | 0.066 | 0.106 | 0.048 | 0.107 | ||
0.044 | 0.103 | 0.003 | 0.101 | −0.032 | 0.092 | 0.097 | 0.128 | 0.067 | 0.105 | 0.042 | 0.109 | ||
0.017 | 0.079 | 0.029 | 0.081 | 0.083 | 0.117 | −0.017 | 0.079 | 0.068 | 0.104 | 0.029 | 0.061 | ||
0.013 | 0.046 | 0.001 | 0.045 | −0.008 | 0.049 | 0.014 | 0.045 | −0.002 | 0.053 | −0.027 | 0.065 | ||
0.014 | 0.066 | 0.037 | 0.078 | 0.075 | 0.110 | −0.018 | 0.079 | 0.074 | 0.109 | 0.291 | 0.318 | ||
−0.167 | 0.293 | −0.012 | 0.090 | 0.042 | 0.084 | −0.059 | 0.121 | 0.141 | 0.163 | 0.129 | 0.155 | ||
0.165 | 0.275 | 0.043 | 0.078 | −0.051 | 0.083 | 0.087 | 0.102 | −0.103 | 0.111 | −0.167 | 0.182 | ||
−0.168 | 0.272 | −0.091 | 0.101 | −0.027 | 0.059 | −0.134 | 0.139 | −0.017 | 0.038 | 0.028 | 0.051 | ||
Par | Case3 | ||||||||||||
Bias | RMS | Bias | RMS | Bias | RMS | ||||||||
0.017 | 0.040 | −0.006 | 0.026 | −0.024 | 0.037 | ||||||||
0.019 | 0.040 | −0.006 | 0.024 | −0.019 | 0.035 | ||||||||
0.021 | 0.039 | −0.010 | 0.029 | −0.018 | 0.032 | ||||||||
0.025 | 0.041 | −0.005 | 0.025 | −0.015 | 0.036 | ||||||||
0.025 | 0.040 | −0.008 | 0.026 | −0.018 | 0.034 | ||||||||
0.026 | 0.041 | −0.006 | 0.026 | −0.019 | 0.037 | ||||||||
0.019 | 0.114 | −0.027 | 0.125 | −0.043 | 0.121 | ||||||||
0.029 | 0.101 | −0.021 | 0.103 | −0.042 | 0.101 | ||||||||
0.025 | 0.101 | −0.018 | 0.097 | −0.044 | 0.097 | ||||||||
0.033 | 0.101 | −0.020 | 0.103 | −0.044 | 0.097 | ||||||||
0.036 | 0.118 | 0.023 | 0.120 | −0.019 | 0.128 | ||||||||
0.042 | 0.103 | 0.016 | 0.092 | −0.020 | 0.100 | ||||||||
0.041 | 0.104 | 0.012 | 0.093 | −0.017 | 0.100 | ||||||||
0.037 | 0.098 | 0.015 | 0.094 | −0.012 | 0.097 | ||||||||
0.013 | 0.076 | 0.033 | 0.075 | 0.086 | 0.117 | ||||||||
0.019 | 0.051 | −0.003 | 0.045 | −0.004 | 0.054 | ||||||||
0.001 | 0.076 | 0.032 | 0.082 | 0.084 | 0.115 | ||||||||
−0.189 | 0.234 | −0.020 | 0.084 | 0.027 | 0.086 | ||||||||
0.110 | 0.204 | 0.035 | 0.069 | −0.041 | 0.078 | ||||||||
−0.202 | 0.241 | −0.080 | 0.093 | −0.028 | 0.057 |
Par | Par | |||||||||||||
Bias | RMS | Bias | RMS | Bias | RMS | Bias | RMS | Bias | RMS | Bias | RMS | |||
−0.003 | 0.023 | −0.006 | 0.027 | 0.002 | 0.025 | −0.078 | 0.134 | 0.083 | 0.121 | 0.087 | 0.132 | |||
0.005 | 0.022 | −0.007 | 0.031 | −0.002 | 0.024 | 0.086 | 0.133 | −0.084 | 0.123 | 0.088 | 0.131 | |||
−0.007 | 0.025 | −0.012 | 0.029 | 0.005 | 0.025 | −0.111 | 0.151 | 0.124 | 0.108 | −0.088 | 0.133 | |||
0.011 | 0.021 | 0.006 | 0.026 | −0.003 | 0.024 | 0.093 | 0.152 | −0.126 | 0.123 | 0.077 | 0.117 | |||
0.006 | 0.025 | 0.003 | 0.027 | −0.002 | 0.026 | 0.087 | 0.142 | 0.084 | 0.116 | 0.088 | 0.124 | |||
0.013 | 0.026 | −0.005 | 0.028 | 0.005 | 0.028 | −0.093 | 0.132 | 0.084 | 0.124 | −0.084 | 0.116 | |||
0.088 | 0.133 | −0.081 | 0.144 | 0.084 | 0.146 | −0.089 | 0.161 | −0.089 | 0.133 | 0.087 | 0.139 | |||
0.094 | 0.146 | −0.099 | 0.139 | 0.088 | 0.144 | −0.092 | 0.144 | 0.067 | 0.125 | 0.081 | 0.129 | |||
0.083 | 0.127 | −0.106 | 0.154 | 0.091 | 0.131 | −0.015 | 0.034 | 0.015 | 0.021 | −0.015 | 0.022 | |||
0.102 | 0.139 | 0.115 | 0.167 | 0.085 | 0.145 | −0.011 | 0.031 | 0.017 | 0.027 | −0.021 | 0.029 | |||
0.087 | 0.142 | 0.083 | 0.131 | 0.088 | 0.153 | −0.017 | 0.032 | 0.011 | 0.025 | −0.021 | 0.034 | |||
0.094 | 0.131 | 0.087 | 0.133 | 0.104 | 0.152 | −0.014 | 0.033 | 0.017 | 0.031 | −0.009 | 0.022 | |||
0.086 | 0.119 | 0.072 | 0.126 | 0.085 | 0.139 | −0.016 | 0.026 | 0.010 | 0.021 | −0.016 | 0.030 | |||
0.077 | 0.122 | 0.082 | 0.124 | 0.064 | 0.125 | −0.011 | 0.017 | 0.012 | 0.022 | −0.013 | 0.024 | |||
0.075 | 0.149 | 0.074 | 0.123 | 0.066 | 0.127 | −0.013 | 0.022 | 0.016 | 0.022 | −0.016 | 0.027 | |||
0.058 | 0.123 | 0.061 | 0.117 | 0.071 | 0.111 | −0.015 | 0.026 | 0.011 | 0.021 | −0.017 | 0.025 | |||
0.062 | 0.119 | 0.066 | 0.112 | 0.059 | 0.114 | 0.084 | 0.143 | −0.092 | 0.173 | −0.065 | 0.166 | |||
0.061 | 0.125 | 0.064 | 0.094 | 0.062 | 0.116 | −0.085 | 0.155 | −0.089 | 0.146 | −0.051 | 0.147 | |||
0.055 | 0.122 | 0.073 | 0.124 | 0.077 | 0.127 | −0.023 | 0.095 | −0.055 | 0.176 | −0.073 | 0.164 | |||
0.054 | 0.094 | 0.075 | 0.124 | 0.073 | 0.126 | 0.044 | 0.151 | 0.088 | 0.181 | −0.068 | 0.151 | |||
0.057 | 0.091 | 0.063 | 0.119 | 0.069 | 0.115 | −0.036 | 0.166 | −0.063 | 0.154 | 0.091 | 0.172 | |||
0.051 | 0.087 | 0.061 | 0.111 | 0.065 | 0.083 | −0.073 | 0.153 | −0.077 | 0.159 | −0.071 | 0.178 | |||
0.092 | 0.114 | 0.085 | 0.089 | 0.067 | 0.083 | −0.084 | 0.155 | −0.069 | 0.178 | −0.063 | 0.147 | |||
−0.082 | 0.091 | 0.067 | 0.077 | −0.066 | 0.065 | −0.067 | 0.148 | −0.084 | 0.179 | −0.067 | 0.151 | |||
−0.083 | 0.092 | 0.066 | 0.076 | −0.063 | 0.071 | −0.092 | 0.144 | 0.076 | 0.131 | −0.088 | 0.136 | |||
−0.088 | 0.093 | 0.063 | 0.075 | −0.062 | 0.068 | −0.091 | 0.148 | 0.082 | 0.129 | −0.111 | 0.157 | |||
0.103 | 0.098 | 0.077 | 0.083 | 0.071 | 0.091 | 0.108 | 0.155 | −0.084 | 0.126 | −0.087 | 0.143 | |||
−0.083 | 0.088 | 0.053 | 0.075 | 0.068 | 0.073 | −0.091 | 0.149 | −0.079 | 0.138 | 0.103 | 0.156 | |||
−0.077 | 0.085 | 0.064 | 0.074 | −0.053 | 0.064 | −0.0.91 | 0.151 | 0.087 | 0.134 | 0.091 | 0.146 | |||
−0.082 | 0.092 | 0.061 | 0.073 | −0.062 | 0.075 | −0.084 | 0.144 | 0.082 | 0.135 | −0.092 | 0.147 | |||
0.103 | 0.155 | −0.088 | 0.141 | −0.096 | 0.137 | |||||||||
0.097 | 0.154 | 0.077 | 0.135 | −0.102 | 0.154 | |||||||||
Par | Par | |||||||||||||
Bias | RMS | Bias | RMS | Bias | RMS | Bias | RMS | Bias | RMS | Bias | RMS | |||
0.053 | 0.116 | 0.043 | 0.111 | 0.054 | 0.115 | 0.086 | 0.152 | 0.088 | 0.156 | 0.103 | 0.156 | |||
0.050 | 0.111 | 0.047 | 0.109 | 0.052 | 0.116 | 0.074 | 0.149 | 0.083 | 0.166 | 0.088 | 0.161 | |||
0.052 | 0.116 | 0.041 | 0.112 | 0.053 | 0.112 | 0.077 | 0.133 | 0.081 | 0.154 | 0.087 | 0.148 | |||
0.066 | 0.110 | 0.053 | 0.106 | 0.053 | 0.113 | 0.082 | 0.154 | 0.087 | 0.157 | 0.112 | 0.157 | |||
0.055 | 0.111 | 0.051 | 0.108 | 0.233 | 0.112 | 0.085 | 0.166 | 0.091 | 0.163 | 0.087 | 0.166 | |||
0.052 | 0.112 | 0.052 | 0.111 | 0.222 | 0.111 | 0.096 | 0.171 | 0.084 | 0.166 | 0.095 | 0.168 | |||
0.051 | 0.115 | 0.046 | 0.104 | 0.057 | 0.123 | 0.054 | 0.141 | 0.078 | 0.141 | 0.077 | 0.153 | |||
0.045 | 0.116 | 0.042 | 0.112 | 0.056 | 0.122 | 0.077 | 0.152 | 0.044 | 0.143 | 0.067 | 0.145 | |||
0.062 | 0.156 | 0.054 | 0.143 | 0.078 | 0.154 | 0.067 | 0.144 | 0.058 | 0.145 | 0.068 | 0.144 | |||
0.077 | 0.161 | 0.071 | 0.151 | 0.088 | 0.155 | 0.066 | 0.138 | 0.071 | 0.144 | 0.061 | 0.143 | |||
0.067 | 0.145 | 0.066 | 0.148 | 0.072 | 0.152 | 0.062 | 0.167 | 0.068 | 0.138 | 0.075 | 0.156 | |||
0.077 | 0.148 | 0.083 | 0.152 | 0.090 | 0.167 | 0.048 | 0.144 | 0.069 | 0.145 | 0.073 | 0.148 | |||
0.074 | 0.150 | 0.071 | 0.161 | 0.076 | 0.161 | 0.067 | 0.156 | 0.059 | 0.144 | 0.076 | 0.149 | |||
0.089 | 0.145 | 0.073 | 0.151 | 0.073 | 0.155 | 0.081 | 0.152 | 0.066 | 0.137 | 0.077 | 0.156 | |||
0.092 | 0.156 | 0.081 | 0.152 | 0.089 | 0.154 | 0.083 | 0.146 | 0.087 | 0.132 | 0.078 | 0.115 | |||
0.082 | 0.155 | 0.087 | 0.153 | 0.077 | 0.151 | 0.049 | 0.123 | 0.067 | 0.112 | 0.061 | 0.135 | |||
0.095 | 0.167 | 0.082 | 0.154 | 0.086 | 0.158 | 0.088 | 0.148 | 0.083 | 0.145 | 0.092 | 0.121 | |||
0.067 | 0.158 | 0.088 | 0.166 | 0.092 | 0.153 |
Par | ||||||
---|---|---|---|---|---|---|
Bias | RMS | Bias | RMS | Bias | RMS | |
0.023 | 0.052 | 0.022 | 0.041 | 0.028 | 0.053 | |
0.026 | 0.054 | 0.024 | 0.039 | 0.027 | 0.054 | |
0.027 | 0.055 | 0.022 | 0.038 | 0.026 | 0.053 | |
0.022 | 0.052 | 0.024 | 0.042 | 0.022 | 0.052 | |
0.021 | 0.051 | 0.021 | 0.038 | 0.027 | 0.055 | |
0.025 | 0.057 | 0.023 | 0.037 | 0.029 | 0.061 | |
0.051 | 0.142 | 0.035 | 0.133 | 0.066 | 0.125 | |
0.047 | 0.121 | 0.029 | 0.118 | 0.058 | 0.111 | |
0.044 | 0.123 | 0.027 | 0.117 | 0.053 | 0.112 | |
0.046 | 0.125 | 0.021 | 0.111 | 0.054 | 0.113 | |
0.052 | 0.144 | 0.037 | 0.139 | 0.68 | 0.129 | |
0.048 | 0.125 | 0.027 | 0.115 | 0.52 | 0.112 | |
0.049 | 0.124 | 0.025 | 0.113 | 0.54 | 0.116 | |
0.048 | 0.126 | 0.024 | 0.112 | 0.51 | 0.111 | |
0.087 | 0.131 | 0.047 | 0.089 | 0.107 | 0.132 | |
0.066 | 0.128 | 0.033 | 0.078 | 0.083 | 0.103 | |
0.091 | 0.133 | 0.051 | 0.104 | 0.091 | 0.128 | |
0.562 | 0.521 | 0.033 | 0.091 | 0.046 | 0.103 | |
0.494 | 0.511 | 0.037 | 0.089 | 0.059 | 0.113 | |
0.326 | 0.367 | 0.106 | 0.126 | 0.044 | 0.073 |
Parameter | With | Without | |||||
---|---|---|---|---|---|---|---|
Estimate | Lower | Upper | Estimate | Lower | Upper | ||
0.891 | 0.886 | 0.893 | 0.897 | 0.895 | 0.901 | ||
0.891 | 0.890 | 0.897 | 0.896 | 0.893 | 0.899 | ||
0.664 | 0.663 | 0.666 | 0.664 | 0.662 | 0.665 | ||
1.211 | 1.207 | 1.238 | 1.194 | 1.189 | 1.207 | ||
0.730 | 0.719 | 0.731 | 0.725 | 0.719 | 0.728 | ||
0.861 | 0.855 | 0.866 | 0.868 | 0.862 | 0.871 | ||
−1.265 | −1.262 | −1.257 | −1.261 | −1.267 | −1.256 | ||
−0.263 | −0.266 | −0.261 | −0.264 | −0.269 | −0.261 | ||
−0.417 | −0.415 | −0.412 | −0.415 | −0.420 | −0.411 | ||
−0.873 | −0.881 | −0.872 | −0.888 | −0.892 | −0.884 | ||
−0.745 | −0.749 | −0.740 | −0.746 | −0.748 | −0.744 | ||
−0.652 | −0.659 | −0.653 | −0.653 | −0.661 | −0.653 | ||
−0.788 | −0.792 | −0.781 | −0.783 | −0.794 | −0.776 | ||
−0.481 | −0.486 | −0.480 | −0.484 | −0.487 | −0.479 | ||
0.532 | 0.528 | 0.535 | 0.541 | 0.539 | 0.543 | ||
−0.025 | −0.026 | −0.023 | −0.025 | −0.025 | −0.024 | ||
0.117 | 0.116 | 0.119 | 0.116 | 0.115 | 0.117 | ||
−3.997 | −4.047 | −3.998 | −4.105 | −4.129 | −4.054 | ||
0.162 | 0.155 | 0.175 | 0.166 | 0.153 | 0.177 | ||
0.133 | 0.128 | 0.135 | 0.131 | 0.128 | 0.135 |
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content. |
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Tuerde, M.; Muhammadhaji, A. Variational Bayesian Estimation of Quantile Nonlinear Dynamic Latent Variable Models with Possible Nonignorable Missingness. Axioms 2024, 13, 849. https://doi.org/10.3390/axioms13120849
Tuerde M, Muhammadhaji A. Variational Bayesian Estimation of Quantile Nonlinear Dynamic Latent Variable Models with Possible Nonignorable Missingness. Axioms. 2024; 13(12):849. https://doi.org/10.3390/axioms13120849
Chicago/Turabian StyleTuerde, Mulati, and Ahmadjan Muhammadhaji. 2024. "Variational Bayesian Estimation of Quantile Nonlinear Dynamic Latent Variable Models with Possible Nonignorable Missingness" Axioms 13, no. 12: 849. https://doi.org/10.3390/axioms13120849
APA StyleTuerde, M., & Muhammadhaji, A. (2024). Variational Bayesian Estimation of Quantile Nonlinear Dynamic Latent Variable Models with Possible Nonignorable Missingness. Axioms, 13(12), 849. https://doi.org/10.3390/axioms13120849