A Comparison of Some Bayesian and Classical Procedures for Simultaneous Equation Models with Weak Instruments
Abstract
1. Introduction
2. The Model
3. Review of Some Bayesian Formulations
3.1. Zellner’s Bayesian Method of Moments Approach (BMOM)
3.2. The Geweke Approach
The Gibbs sampler cycles through the full conditional densities of the parameter blocks of the model; the conditional densities of Π2 and β are given in (11) and (12).
3.3. The Chao and Phillips Approach
3.4. The Kleibergen and van Dijk Approach
3.5. The Jackknife Instrumental Variable Estimator (JIVE)
4. Posterior Simulator: “Gibbs within M–H” Algorithm
- 0. Choose starting values x^0.
- 1. Draw a candidate x^i from the candidate-generating density r(x).
- 2. Accept x^i with the M–H acceptance probability; otherwise set x^i = x^{i−1}.
- 3. Increment i. Go to 1.
- 0. Choose starting values.
- 1. Draw each parameter block in turn from its conditional density (the Gibbs step).
- 2. Accept the draw with the probability defined in (24); otherwise retain the previous draw.
- 3. Go to 1.
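The generic scheme above can be sketched in code. The snippet below is a minimal, self-contained illustration of an independence-chain Metropolis–Hastings sampler; the target and candidate densities (a normal target and a wider normal candidate) are toy stand-ins for the posterior and the candidate-generating density r(x), not the densities used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins (assumptions, not the paper's densities):
# target p(x) ~ N(2, 1); candidate r(x) ~ N(0, 3^2), drawn
# independently of the current state, as in step 1.
def log_p(x):          # log target density (unnormalized is fine)
    return -0.5 * (x - 2.0) ** 2

def log_r(x):          # log candidate-generating density
    return -0.5 * (x / 3.0) ** 2

x = 0.0                # step 0: starting value
draws = []
for _ in range(20000):
    cand = rng.normal(0.0, 3.0)            # step 1: draw from r(x)
    # step 2: accept with prob min{1, p(cand) r(x) / [p(x) r(cand)]}
    log_alpha = (log_p(cand) - log_p(x)) + (log_r(x) - log_r(cand))
    if np.log(rng.uniform()) < log_alpha:
        x = cand                           # accept the candidate
    draws.append(x)                        # otherwise retain x (step 3)

# after burn-in, the chain's mean should be close to the target mean of 2
print(round(float(np.mean(draws[2000:])), 1))
```

The acceptance ratio corrects for the mismatch between the candidate density and the target, which is exactly the role the M–H step plays when the Gibbs draws come from the likelihood rather than the posterior itself.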
4.1. Implementing the CP Approach
- 0. Choose starting values.
- 1. Draw Σ−1,i from its conditional density; then draw the remaining parameter blocks from their conditional densities.
- 2. Accept the draw as a drawing from the posterior (14) with the corresponding acceptance probability.
- 3. Go to 1.
4.2. Implementing the KVD Approach
- 0. Choose starting values.
- 1. Draw the parameter blocks from their conditional densities.
- 2. Perform a singular value decomposition of the resulting draw.
- 3. Compute the transformed parameters according to (18) and (19).
- 4. Compute the quantities in (29) and (30).
- 5. Draw from the corresponding conditional density.
- 6. Accept the draw as a drawing from the posterior with the corresponding acceptance probability.
- 7. Go to 1.
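Step 2 of this sampler hinges on a singular value decomposition. As a minimal illustration of the mechanics only — the matrix decomposed below is a random stand-in, not the matrix defined by the paper's expressions (18) and (19):

```python
import numpy as np

rng = np.random.default_rng(1)

# Stand-in for a (k2 x m) reduced-form coefficient draw; the actual
# matrix decomposed in the KVD sampler is defined around (18)-(19).
k2, m = 4, 2
Pi2_draw = rng.normal(size=(k2, m))

U, s, Vt = np.linalg.svd(Pi2_draw, full_matrices=False)

# The smallest singular value measures how close the draw is to
# reduced rank; s[-1] near zero corresponds to the weakly identified case.
print(s[-1] >= 0.0)

# Sanity check: the decomposition reconstructs the original matrix.
print(np.allclose(U @ np.diag(s) @ Vt, Pi2_draw))
```

Separating the draw into its singular values and singular vectors is what lets the sampler isolate the rank-reduction direction that drives the identification problem.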
4.3. Convergence Diagnosis
5. Simulation Results and Discussions
- (1)
- Ordinary least squares (OLS)
- (2)
- Two stage least squares (2SLS)
- (3)
- Zellner’s minimum expected loss (MELO) estimator (Zellner 1978, 1986).
- (4)
- Zellner’s Bayesian method of moments relative to balanced loss function (BMOM)11
- (5)
- Classical LIML. We compute classical LIML as an iterated Aitken estimator (see Pagan (1979) and Gao and Lahiri (2000a)).
- (6)
- Fuller’s modified LIML estimators with modification constants 1 and 4 (Fuller 1977), denoted Fuller1 and Fuller4 in the tables.
- (7)
- JIVE.
- (8)
- Mode and median of the marginal posterior density of β from the Geweke approach (Geweke_Mode, Geweke_Median).
- (9)
- Mode and median of the marginal density of β based on classical LIML from Gibbs sampling (LIML-GS). LIML-GS is a byproduct of the “Gibbs within M–H” algorithm for the CP approach since the likelihood function is used as the candidate-generating density to explore the CP posterior.
- (10)
- Posterior mode and median from CP approach using “Gibbs within M–H” algorithm.
- (11)
- Posterior mode and median from KVD approach using “Gibbs within M–H” algorithm.
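To make the comparison concrete, the sketch below simulates one draw from a weak-instrument design in the spirit of the experiments (true β = 1, correlated structural and reduced-form errors, small first-stage coefficients) and computes the first two estimators in the list, OLS and 2SLS. The specific parameter values (T, k2, ρ, Π2) are illustrative assumptions, not the exact settings of any table.

```python
import numpy as np

rng = np.random.default_rng(42)

# Illustrative weak-instrument DGP (assumed values, not a table's settings):
# y1 = beta*y2 + u,  y2 = X2 @ Pi2 + v,  corr(u, v) = rho.
T, k2, beta, rho = 100, 4, 1.0, 0.6
X2 = rng.normal(size=(T, k2))                 # instruments
Pi2 = np.full(k2, 0.1)                        # weak first-stage coefficients
cov = np.array([[1.0, rho], [rho, 1.0]])      # correlated errors
u, v = rng.multivariate_normal([0.0, 0.0], cov, size=T).T
y2 = X2 @ Pi2 + v                             # endogenous regressor
y1 = beta * y2 + u                            # structural equation

# OLS: inconsistent here, pulled away from beta by the error correlation.
b_ols = (y2 @ y1) / (y2 @ y2)

# 2SLS: project y2 on the instruments, regress y1 on the fitted values.
P = X2 @ np.linalg.solve(X2.T @ X2, X2.T)     # projection onto span(X2)
y2_hat = P @ y2
b_2sls = (y2_hat @ y1) / (y2_hat @ y2)

print(round(float(b_ols), 2), round(float(b_2sls), 2))
```

With instruments this weak, 2SLS is consistent but biased toward OLS in finite samples, which is the pattern the tables of means and RMSEs document across the estimators.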
6. Conclusions
Author Contributions
Funding
Acknowledgments
Conflicts of Interest
Appendix A
References
- Ackerberg, Daniel A., and Paul J. Devereux. 2006. Comment on the case against JIVE. Journal of Applied Econometrics 21: 835–38.
- Andrews, Donald W. K., and James H. Stock. 2005. Inference with weak instruments. In Advances in Economics and Econometrics, Theory and Applications. Edited by Richard Blundell, Whitney K. Newey and Torsten Persson. Ninth World Congress of the Econometric Society. Cambridge: Cambridge University Press, vol. III.
- Andrews, Donald W. K., James H. Stock, and Liyang Sun. 2019. Weak instruments in IV regression: Theory and practice. Annual Review of Economics. forthcoming.
- Angrist, Joshua D., Guido W. Imbens, and Alan Krueger. 1999. Jackknife instrumental variables estimation. Journal of Applied Econometrics 14: 57–67.
- Billingsley, Patrick. 1986. Probability and Measure. New York: Wiley.
- Blomquist, Sören, and Matz Dahlberg. 1999. Small sample properties of LIML and jackknife IV estimators: Experiments with weak instruments. Journal of Applied Econometrics 14: 69–88.
- Blomquist, Sören, and Matz Dahlberg. 2006. The case against Jive: A comment. Journal of Applied Econometrics 21: 839–41.
- Bound, John, David A. Jaeger, and Regina M. Baker. 1995. Problems with instrumental variables estimation when the correlation between the instruments and the endogenous explanatory variable is weak. Journal of the American Statistical Association 90: 443–50.
- Brooks, Stephen P. 1996. Quantitative Convergence Diagnosis for MCMC via CUSUMS. Technical Report. Bristol: University of Bristol.
- Brooks, Stephen P., and Gareth O. Roberts. 1998. Assessing convergence of Markov chain Monte Carlo algorithms. Statistics and Computing 8: 319–35.
- Buse, Adolf. 1992. The bias of instrumental variable estimators. Econometrica 60: 173–80.
- Casella, George, and Edward I. George. 1992. Explaining the Gibbs sampler. The American Statistician 46: 167–74.
- Chao, John C., and Peter C. B. Phillips. 1998. Posterior distributions in limited information analysis of the simultaneous equations model using the Jeffreys prior. Journal of Econometrics 87: 49–86.
- Chib, Siddhartha, and Edward Greenberg. 1995. Understanding the Metropolis–Hastings algorithm. The American Statistician 49: 327–35.
- Chib, Siddhartha, and Edward Greenberg. 1996. Markov chain Monte Carlo simulation methods in econometrics. Econometric Theory 12: 409–31.
- Conley, Timothy G., Christian B. Hansen, Robert McCulloch, and Peter E. Rossi. 2008. A semi-parametric Bayesian approach to the instrumental variable problem. Journal of Econometrics 144: 276–305.
- Cowles, Mary K., and Bradley P. Carlin. 1996. Markov chain Monte Carlo convergence diagnosis: A comparative review. Journal of the American Statistical Association 91: 883–904.
- Davidson, Russell, and James G. MacKinnon. 2006a. The case against JIVE. Journal of Applied Econometrics 21: 827–33.
- Davidson, Russell, and James G. MacKinnon. 2006b. The case against JIVE: Reply. Journal of Applied Econometrics 21: 843–44.
- Drèze, Jacques H. 1976. Bayesian limited information analysis of the simultaneous equations model. Econometrica 44: 1045–75.
- Drèze, Jacques H., and Juan-Antonio A. Morales. 1976. Bayesian full information analysis of simultaneous equations. Journal of the American Statistical Association 71: 329–54.
- Drèze, Jacques H., and Jean François Richard. 1983. Bayesian analysis of simultaneous equation systems. In Handbook of Econometrics. Edited by Zvi Griliches and Michael Intriligator. Amsterdam: North Holland.
- Dwivedi, Tryambakeshwar D., and Virendra K. Srivastava. 1984. Exact finite sample properties of double k-class estimators in simultaneous equations. Journal of Econometrics 25: 263–83.
- Fuller, Wayne A. 1977. Some properties of a modification of the limited information estimator. Econometrica 45: 939–53.
- Gao, Chuanming, and Kajal Lahiri. 2000a. Further consequences of viewing LIML as an iterated Aitken estimator. Journal of Econometrics 98: 187–202.
- Gao, Chuanming, and Kajal Lahiri. 2000b. MCMC algorithms for two recent Bayesian limited information estimators. Economics Letters 66: 121–26.
- Gao, Chuanming, and Kajal Lahiri. 2001. A note on the double k-class estimator in simultaneous equations. Journal of Econometrics 108: 101–11.
- Geweke, John. 1996. Bayesian reduced rank regression in econometrics. Journal of Econometrics 75: 121–46.
- Hastings, W. Keith. 1970. Monte Carlo sampling methods using Markov chains and their applications. Biometrika 57: 97–109.
- Kleibergen, Frank. 1997. Equality Restricted Random Variables: Densities and Sampling Algorithms. Econometric Institute Report 9662/A. Rotterdam: Erasmus University Rotterdam.
- Kleibergen, Frank. 1998. Conditional Densities in Econometrics. Econometric Institute Research Papers EI 9853, Erasmus School of Economics (ESE), Discussion Paper. Rotterdam: Erasmus University Rotterdam.
- Kleibergen, Frank, and Herman K. van Dijk. 1998. Bayesian simultaneous equation analysis using reduced rank structures. Econometric Theory 14: 701–43.
- Kleibergen, Frank, and Eric Zivot. 2003. Bayesian and classical approaches to instrumental variable regression. Journal of Econometrics 114: 29–72.
- MacEachern, Steven N., and L. Mark Berliner. 1994. Subsampling the Gibbs sampler. The American Statistician 48: 188–90.
- Maddala, Gangadharrao S. 1976. Weak priors and sharp posteriors in simultaneous equation models. Econometrica 44: 345–51.
- Maddala, Gangadharrao S., and Jinook Jeong. 1992. On the exact small sample distribution of the instrumental variable estimator. Econometrica 60: 181–83.
- Mariano, Roberto S., and James B. McDonald. 1979. A note on the distribution functions of LIML and 2SLS structural coefficient in the exactly identified case. Journal of the American Statistical Association 74: 847–48.
- Metropolis, Nicholas, Arianna W. Rosenbluth, Marshall N. Rosenbluth, Augusta H. Teller, and Edward Teller. 1953. Equation of state calculations by fast computing machines. Journal of Chemical Physics 21: 1087–92.
- Ni, Shawn, and Dongchu Sun. 2003. Noninformative priors and frequentist risks of Bayesian estimators of vector-autoregressive models. Journal of Econometrics 115: 159–97.
- Ni, Shawn, Dongchu Sun, and Xiaoqian Sun. 2007. Intrinsic Bayesian estimation of vector autoregression impulse responses. Journal of Business & Economic Statistics 25: 163–76.
- Pagan, Adrian R. 1979. Some consequences of viewing LIML as an iterated Aitken estimator. Economics Letters 3: 369–72.
- Percy, David F. 1992. Prediction for seemingly unrelated regressions. Journal of the Royal Statistical Society B 54: 243–52.
- Poirier, Dale J. 1995. Intermediate Statistics and Econometrics. Cambridge: MIT Press.
- Poirier, Dale J. 1996. Prior beliefs about fit. In Bayesian Statistics 5: Proceedings of the Fifth Valencia International Meeting. Edited by Jose M. Bernardo, James O. Berger, A. Philip Dawid and Adrian F. M. Smith. Oxford: Clarendon Press.
- Radchenko, Stanislav, and Hiroki Tsurumi. 2006. Limited information Bayesian analysis of a simultaneous equation with an autocorrelated error term and its application to the U.S. gasoline market. Journal of Econometrics 133: 31–49.
- Raftery, Adrian E., and Stephen M. Lewis. 1992. How many iterations in the Gibbs sampler? In Bayesian Statistics 4: Proceedings of the Fourth Valencia International Meeting. Edited by Jose M. Bernardo, Adrian F. M. Smith, A. Philip Dawid and James O. Berger. Oxford: Oxford University Press.
- Smith, Adrian F. M., and Gareth O. Roberts. 1993. Bayesian computation via the Gibbs sampler and related Markov chain Monte Carlo methods. Journal of the Royal Statistical Society B 55: 3–23.
- Staiger, Douglas, and James H. Stock. 1997. Instrumental variables regression with weak instruments. Econometrica 65: 557–86.
- Tierney, Luke. 1994. Markov chains for exploring posterior distributions. Annals of Statistics 22: 1701–67.
- Tsurumi, Hiroki. 1990. Comparing Bayesian and non-Bayesian limited information estimators. In Bayesian and Likelihood Methods in Statistics and Econometrics. Edited by Seymour Geisser, James S. Hodges, S. James Press and Arnold Zellner. Amsterdam: North-Holland.
- Young, Alwyn. 2019. Consistency without Inference: Instrumental Variables in Practical Application. London: London School of Economics.
- Yu, Bin, and Per Mykland. 1998. Looking at Markov samplers through Cusum path plots: A simple diagnostic idea. Statistics and Computing 8: 275–86.
- Zellner, Arnold. 1971. An Introduction to Bayesian Inference in Econometrics. New York: Wiley.
- Zellner, Arnold. 1978. Estimation of functions of population means and regression coefficients: A minimum expected loss (MELO) approach. Journal of Econometrics 8: 127–58.
- Zellner, Arnold. 1986. Further results on Bayesian minimum expected loss (MELO) estimates and posterior distributions for structural coefficients. In Advances in Econometrics. Edited by Daniel L. Slottje. Amsterdam: Elsevier, vol. 5, pp. 171–82.
- Zellner, Arnold. 1994. Bayesian and non-Bayesian estimation using balanced loss functions. In Statistical Decision Theory and Related Topics. Edited by Shanti S. Gupta and James O. Berger. New York: Springer, vol. V, chp. 28, pp. 377–90.
- Zellner, Arnold. 1998. The finite sample properties of simultaneous equations’ estimates and estimators: Bayesian and non-Bayesian approaches. Journal of Econometrics 83: 185–212.
- Zellner, Arnold, Luc Bauwens, and Herman K. van Dijk. 1988. Bayesian specification analysis and estimation of simultaneous equation models using Monte Carlo methods. Journal of Econometrics 38: 39–72.
- Zellner, Arnold, Tomohiro Ando, Nalan Baştürk, Lennart Hoogerheide, and Herman K. van Dijk. 2014. Bayesian analysis of instrumental variable models: Acceptance-rejection within direct Monte Carlo. Econometric Reviews 33: 3–35.
1 | Zellner (1998) and Zellner et al. (2014) contain comprehensive reviews of the finite sample properties of SEM estimators, and emphasize the need for finite sample optimal estimation procedures for such models. Andrews and Stock (2005) review recent developments in methods that deal with weak instruments in IV regression models, and present new testing results under “many weak-IV asymptotics”. |
2 | There has been considerable interest in the estimation of LISEMs with weak instruments. See Buse (1992); Bound et al. (1995); Staiger and Stock (1997); Angrist et al. (1999); Blomquist and Dahlberg (1999), among others. More recently, Andrews et al. (2019) review the literature on weak instruments in linear IV regression, and suggest that weak instruments remain an important issue in empirical practice. |
3 | |
4 | The expressions for the conditional densities of Π2 and β given in (Geweke 1996, expressions (11) and (13)) contain some typographical errors and are corrected here in (11) and (12). |
5 | Note that neither this formulation nor the singular value decomposition changes the identification status of the LISEM specified by (1) and (2). If Π2 = 0, β is locally nonidentified. |
6 | |
7 | Zellner et al. (2014) suggested a variant of this approach called Acceptance-Rejection within Direct Monte Carlo (ARDMC) to evaluate the posterior density, and report substantial gain in computational efficiency, particularly with weak instruments. They also studied the existence conditions for posterior moments of the parameters of interest in terms of the number of available instruments being greater than the number of endogenous variables plus the order of the moment. |
8 | Gao and Lahiri (2000b) illustrated the algorithm empirically with a simple labor supply model. |
9 | See also Kleibergen (1997, 1998). Note that their claimed relationship that |J(Φ, (Π2, β, λ))| ≥ |J(Φ, (Π2, β, λ))|λ=0 is analytically incorrect; see the Appendix A for proof. |
10 | In practice, there is often a concern about possible underestimation of the true length of the burn-in period with the Raftery and Lewis method if the quantile of interest is not properly pre-specified; see Brooks and Roberts (1998). |
11 | |
12 | |
13 | We do not report cases with |ρ| = 0.99 or 1. As pointed out by Maddala and Jeong (1992), when the instruments are weak and |ρ| is very close to one, the exact finite sample distribution of the IV estimator is bimodal. Our experiments show that the marginal posterior density of β from the Bayesian approaches exhibits a similar pattern. |
14 | Denote . Using , we have , , and . Letting , the second relationship may be rewritten as: |
15 | Medians were also calculated. Since they were very close to the corresponding means in all our experiments, we did not report them in this paper. |
16 | When k2 = (m − 1), a diffuse prior in (20) for the linear model implies that the prior for the parameters of the LISEM (4) is |
17 | Note that the relationship between the standardized parameter vector and the original parameter vector involves the nuisance parameters, cf. Chao and Phillips (1998). However, when a SEM is in orthonormal canonical form (i.e., the exogenous regressors are orthonormal and the disturbance covariance matrix Ω is an identity matrix), both the density of the random parameter β from the CP approach and the probability density of the classical LIML estimator for β are conditional on this information. |
18 | Ackerberg and Devereux (2006) and Blomquist and Dahlberg (2006) have suggested some ad hoc adjustments to the original JIVE formula to improve its performance. |
Estimator | Mean | Std | RMSE | MAD |
---|---|---|---|---|
OLS | 1.348 | 0.089 | 0.359 | 0.348 |
2SLS | 1.045 | 0.144 | 0.151 | 0.121 |
MELO | 1.115 | 0.126 | 0.171 | 0.144 |
BMOM | 0.967 | 0.127 | 0.131 | 0.102 |
LIML | 0.998 | 0.152 | 0.152 | 0.118 |
Fuller1 | 1.015 | 0.147 | 0.148 | 0.116 |
Fuller4 | 1.061 | 0.136 | 0.149 | 0.120 |
JIVE | 0.957 | 0.178 | 0.183 | 0.141 |
Geweke_Mode | 1.056 | 0.140 | 0.151 | 0.122 |
Geweke_Median | 1.031 | 0.143 | 0.146 | 0.116 |
LIML_GS_Mode | 1.061 | 0.139 | 0.152 | 0.123 |
LIML_GS_Median | 1.036 | 0.142 | 0.146 | 0.116 |
CP_Mode | 1.046 | 0.144 | 0.151 | 0.121 |
CP_Median | 1.021 | 0.145 | 0.147 | 0.115 |
KVD_Mode | 1.090 | 0.148 | 0.173 | 0.143 |
KVD_Median | 1.079 | 0.137 | 0.158 | 0.130 |
Estimator | Mean | Std | RMSE | MAD |
---|---|---|---|---|
OLS | 1.537 | 0.111 | 0.548 | 0.537 |
2SLS | 1.030 | 0.345 | 0.346 | 0.267 |
MELO | 1.173 | 0.262 | 0.314 | 0.248 |
BMOM | 0.881 | 0.264 | 0.290 | 0.229 |
LIML | 1.030 | 0.345 | 0.346 | 0.267 |
Fuller1 | 1.107 | 0.300 | 0.319 | 0.245 |
Fuller4 | 1.250 | 0.219 | 0.332 | 0.277 |
JIVE | 0.803 | 0.491 | 0.529 | 0.409 |
Geweke_Mode | 1.089 | 0.331 | 0.343 | 0.265 |
Geweke_Median | 0.907 | 0.518 | 0.526 | 0.358 |
LIML_GS_Mode | 1.091 | 0.313 | 0.326 | 0.255 |
LIML_GS_Median | 0.778 | 1.386 | 1.404 | 0.592 |
CP_Mode | 1.108 | 0.309 | 0.327 | 0.256 |
CP_Median | 0.797 | 1.383 | 1.398 | 0.580 |
KVD_Mode | n.a. | n.a. | n.a. | n.a. |
KVD_Median | n.a. | n.a. | n.a. | n.a. |
Estimator | Mean | Std | RMSE | MAD |
---|---|---|---|---|
OLS | 1.539 | 0.111 | 0.550 | 0.539 |
2SLS | 1.231 | 0.279 | 0.362 | 0.296 |
MELO | 1.366 | 0.186 | 0.411 | 0.368 |
BMOM | 0.943 | 0.184 | 0.193 | 0.154 |
LIML | 1.043 | 0.579 | 0.581 | 0.386 |
Fuller1 | 1.143 | 0.367 | 0.394 | 0.307 |
Fuller4 | 1.281 | 0.244 | 0.372 | 0.307 |
JIVE | 0.816 | 0.568 | 0.597 | 0.474 |
Geweke_Mode | 1.244 | 0.287 | 0.377 | 0.309 |
Geweke_Median | 1.204 | 0.309 | 0.370 | 0.300 |
LIML_GS_Mode | 1.260 | 0.268 | 0.373 | 0.308 |
LIML_GS_Median | 1.220 | 0.298 | 0.370 | 0.300 |
CP_Mode | 1.230 | 0.293 | 0.372 | 0.301 |
CP_Median | 1.194 | 0.315 | 0.370 | 0.298 |
KVD_Mode | 1.351 | 0.384 | 0.520 | 0.389 |
KVD_Median | 1.381 | 0.367 | 0.529 | 0.405 |
Estimator | Mean | Std | RMSE | MAD |
---|---|---|---|---|
OLS | 1.535 | 0.111 | 0.546 | 0.535 |
2SLS | 1.363 | 0.221 | 0.425 | 0.371 |
MELO | 1.463 | 0.139 | 0.483 | 0.463 |
BMOM | 0.969 | 0.132 | 0.136 | 0.106 |
LIML | 1.090 | 0.864 | 0.869 | 0.534 |
Fuller1 | 1.182 | 0.479 | 0.512 | 0.366 |
Fuller4 | 1.302 | 0.291 | 0.419 | 0.333 |
JIVE | 0.706 | 0.933 | 0.978 | 0.728 |
Geweke_Mode | 1.357 | 0.239 | 0.430 | 0.367 |
Geweke_Median | 1.350 | 0.245 | 0.427 | 0.361 |
LIML_GS_Mode | 1.375 | 0.218 | 0.328 | 0.380 |
LIML_GS_Median | 1.367 | 0.228 | 0.432 | 0.374 |
CP_Mode | 1.215 | 0.629 | 0.665 | 0.466 |
CP_Median | 1.255 | 0.388 | 0.464 | 0.346 |
KVD_Mode | 1.550 | 0.376 | 0.666 | 0.556 |
KVD_Median | 1.573 | 0.322 | 0.657 | 0.576 |
Estimator | Mean | Std | RMSE | MAD |
---|---|---|---|---|
OLS | 1.538 | 0.077 | 0.543 | 0.538 |
2SLS | 1.138 | 0.208 | 0.250 | 0.200 |
MELO | 1.257 | 0.156 | 0.301 | 0.264 |
BMOM | 0.954 | 0.156 | 0.163 | 0.127 |
LIML | 1.023 | 0.280 | 0.281 | 0.210 |
Fuller1 | 1.069 | 0.250 | 0.259 | 0.197 |
Fuller4 | 1.171 | 0.195 | 0.259 | 0.209 |
JIVE | 0.914 | 0.320 | 0.331 | 0.262 |
Geweke_Mode | 1.149 | 0.215 | 0.262 | 0.208 |
Geweke_Median | 1.111 | 0.228 | 0.254 | 0.198 |
LIML_GS_Mode | 1.162 | 0.205 | 0.261 | 0.209 |
LIML_GS_Median | 1.117 | 0.225 | 0.254 | 0.199 |
CP_Mode | 1.155 | 0.207 | 0.259 | 0.206 |
CP_Median | 1.107 | 0.228 | 0.252 | 0.196 |
KVD_Mode | 1.233 | 0.205 | 0.310 | 0.258 |
KVD_Median | 1.215 | 0.210 | 0.301 | 0.243 |
Estimator | Mean | Std | RMSE | MAD |
---|---|---|---|---|
OLS | 1.542 | 0.078 | 0.548 | 0.542 |
2SLS | 1.258 | 0.197 | 0.325 | 0.274 |
MELO | 1.376 | 0.134 | 0.399 | 0.376 |
BMOM | 0.972 | 0.132 | 0.135 | 0.110 |
LIML | 1.003 | 0.437 | 0.437 | 0.291 |
Fuller1 | 1.071 | 0.311 | 0.319 | 0.243 |
Fuller4 | 1.180 | 0.233 | 0.294 | 0.232 |
JIVE | 0.927 | 0.408 | 0.414 | 0.333 |
Geweke_Mode | 1.253 | 0.201 | 0.323 | 0.269 |
Geweke_Median | 1.238 | 0.206 | 0.315 | 0.261 |
LIML_GS_Mode | 1.265 | 0.196 | 0.330 | 0.278 |
LIML_GS_Median | 1.247 | 0.202 | 0.319 | 0.266 |
CP_Mode | 1.196 | 0.264 | 0.329 | 0.266 |
CP_Median | 1.192 | 0.232 | 0.301 | 0.240 |
KVD_Mode | 1.371 | 0.278 | 0.464 | 0.382 |
KVD_Median | 1.395 | 0.269 | 0.478 | 0.397 |
Estimator | Mean | Std | RMSE | MAD |
---|---|---|---|---|
OLS | 1.565 | 0.080 | 0.571 | 0.565 |
2SLS | 1.254 | 0.282 | 0.380 | 0.309 |
MELO | 1.376 | 0.184 | 0.419 | 0.379 |
BMOM | 0.953 | 0.183 | 0.189 | 0.150 |
LIML | 1.052 | 0.584 | 0.586 | 0.392 |
Fuller1 | 1.158 | 0.377 | 0.409 | 0.307 |
Fuller4 | 1.296 | 0.244 | 0.384 | 0.317 |
JIVE | 0.833 | 0.638 | 0.659 | 0.527 |
Geweke_Mode | 1.264 | 0.285 | 0.388 | 0.314 |
Geweke_Median | 1.224 | 0.316 | 0.387 | 0.305 |
LIML_GS_Mode | 1.274 | 0.283 | 0.394 | 0.320 |
LIML_GS_Median | 1.232 | 0.310 | 0.387 | 0.306 |
CP_Mode | 1.263 | 0.295 | 0.395 | 0.318 |
CP_Median | 1.223 | 0.316 | 0.387 | 0.304 |
KVD_Mode | 1.388 | 0.389 | 0.549 | 0.418 |
KVD_Median | 1.394 | 0.315 | 0.504 | 0.414 |
Estimator | Mean | Std | RMSE | MAD |
---|---|---|---|---|
OLS | 1.574 | 0.076 | 0.579 | 0.574 |
2SLS | 1.386 | 0.219 | 0.444 | 0.394 |
MELO | 1.478 | 0.131 | 0.496 | 0.478 |
BMOM | 0.979 | 0.129 | 0.131 | 0.105 |
LIML | 1.139 | 0.882 | 0.893 | 0.545 |
Fuller1 | 1.224 | 0.477 | 0.527 | 0.389 |
Fuller4 | 1.335 | 0.280 | 0.437 | 0.358 |
JIVE | 0.844 | 0.823 | 0.838 | 0.663 |
Geweke_Mode | 1.385 | 0.243 | 0.455 | 0.395 |
Geweke_Median | 1.380 | 0.246 | 0.453 | 0.390 |
LIML_GS_Mode | 1.397 | 0.230 | 0.459 | 0.404 |
LIML_GS_Median | 1.387 | 0.236 | 0.453 | 0.396 |
CP_Mode | 1.338 | 0.465 | 0.575 | 0.433 |
CP_Median | 1.337 | 0.311 | 0.459 | 0.376 |
KVD_Mode | 1.584 | 0.462 | 0.745 | 0.592 |
KVD_Median | 1.608 | 0.368 | 0.711 | 0.610 |
Estimator | Mean | Std | RMSE | MAD |
---|---|---|---|---|
OLS | 1.172 | 0.090 | 0.194 | 0.174 |
2SLS | 1.046 | 0.253 | 0.257 | 0.206 |
MELO | 1.083 | 0.189 | 0.206 | 0.164 |
BMOM | 0.859 | 0.190 | 0.237 | 0.195 |
LIML | 1.017 | 0.333 | 0.333 | 0.260 |
Fuller1 | 1.029 | 0.298 | 0.299 | 0.236 |
Fuller4 | 1.059 | 0.235 | 0.242 | 0.192 |
JIVE | 0.957 | 0.417 | 0.419 | 0.340 |
Geweke_Mode | 1.053 | 0.251 | 0.257 | 0.200 |
Geweke_Median | 1.041 | 0.267 | 0.270 | 0.214 |
LIML_GS_Mode | 1.058 | 0.244 | 0.251 | 0.197 |
LIML_GS_Median | 1.044 | 0.265 | 0.269 | 0.212 |
CP_Mode | 1.054 | 0.255 | 0.261 | 0.205 |
CP_Median | 1.040 | 0.271 | 0.274 | 0.218 |
KVD_Mode | 1.131 | 0.368 | 0.391 | 0.237 |
KVD_Median | 1.161 | 0.328 | 0.365 | 0.245 |
Estimator | Mean | Std | RMSE | MAD |
---|---|---|---|---|
OLS | 1.179 | 0.096 | 0.203 | 0.181 |
2SLS | 1.085 | 0.214 | 0.230 | 0.182 |
MELO | 1.124 | 0.146 | 0.192 | 0.154 |
BMOM | 0.823 | 0.143 | 0.228 | 0.193 |
LIML | 0.992 | 0.397 | 0.397 | 0.301 |
Fuller1 | 1.015 | 0.347 | 0.347 | 0.270 |
Fuller4 | 1.055 | 0.267 | 0.273 | 0.216 |
JIVE | 0.991 | 0.481 | 0.481 | 0.390 |
Geweke_Mode | 1.084 | 0.218 | 0.234 | 0.184 |
Geweke_Median | 1.079 | 0.223 | 0.237 | 0.187 |
LIML_GS_Mode | 1.087 | 0.212 | 0.229 | 0.181 |
LIML_GS_Median | 1.082 | 0.218 | 0.233 | 0.185 |
CP_Mode | 1.054 | 0.308 | 0.313 | 0.223 |
CP_Median | 1.063 | 0.254 | 0.262 | 0.207 |
KVD_Mode | 1.249 | 0.234 | 0.342 | 0.283 |
KVD_Median | 1.286 | 0.235 | 0.370 | 0.308 |
Estimator | Mean | Std | RMSE | MAD |
---|---|---|---|---|
OLS | 1.846 | 0.052 | 0.848 | 0.846 |
2SLS | 1.359 | 0.180 | 0.402 | 0.363 |
MELO | 1.572 | 0.118 | 0.584 | 0.572 |
BMOM | 1.057 | 0.118 | 0.131 | 0.102 |
LIML | 0.988 | 0.404 | 0.404 | 0.255 |
Fuller1 | 1.169 | 0.196 | 0.259 | 0.221 |
Fuller4 | 1.417 | 0.120 | 0.434 | 0.417 |
JIVE | 0.637 | 0.611 | 0.711 | 0.478 |
Geweke_Mode | 1.347 | 0.302 | 0.460 | 0.358 |
Geweke_Median | 1.277 | 0.377 | 0.468 | 0.305 |
LIML_GS_Mode | 1.338 | 0.155 | 0.372 | 0.345 |
LIML_GS_Median | 1.252 | 0.194 | 0.318 | 0.281 |
CP_Mode | 1.314 | 0.162 | 0.353 | 0.325 |
CP_Median | 1.234 | 0.194 | 0.304 | 0.266 |
KVD_Mode | 1.411 | 0.379 | 0.559 | 0.428 |
KVD_Median | 1.462 | 0.463 | 0.654 | 0.514 |
Estimator | Mean | Std | RMSE | MAD |
---|---|---|---|---|
OLS | 1.850 | 0.033 | 0.851 | 0.850 |
2SLS | 1.230 | 0.126 | 0.262 | 0.234 |
MELO | 1.414 | 0.094 | 0.425 | 0.414 |
BMOM | 1.044 | 0.095 | 0.105 | 0.082 |
LIML | 1.025 | 0.170 | 0.172 | 0.132 |
Fuller1 | 1.095 | 0.142 | 0.171 | 0.143 |
Fuller4 | 1.264 | 0.099 | 0.282 | 0.265 |
JIVE | 0.873 | 0.199 | 0.236 | 0.191 |
Geweke_Mode | 1.216 | 0.117 | 0.246 | 0.223 |
Geweke_Median | 1.150 | 0.127 | 0.197 | 0.172 |
LIML_GS_Mode | 1.227 | 0.118 | 0.256 | 0.235 |
LIML_GS_Median | 1.158 | 0.128 | 0.203 | 0.180 |
CP_Mode | 1.221 | 0.116 | 0.250 | 0.228 |
CP_Median | 1.154 | 0.127 | 0.200 | 0.176 |
KVD_Mode | 1.258 | 0.207 | 0.331 | 0.280 |
KVD_Median | 1.252 | 0.294 | 0.387 | 0.260 |
Estimator | Mean | Std | RMSE | MAD | Remarks |
---|---|---|---|---|---|
T = 50, ρ = −0.60, k2 = 4, R2 = 0.40 | |||||
BMOM | 0.852 | 0.129 | 0.196 | 0.165 | Compare Table 1. |
KVD_Mode | 0.971 | 0.150 | 0.152 | 0.119 | Acceptance rate for |
KVD_Median | 0.999 | 0.153 | 0.152 | 0.119 | KVD: 0.713 (0.130) |
T = 50, ρ = −0.60, k2 = 4, R2 = 0.10 | |||||
BMOM | 0.551 | 0.191 | 0.488 | 0.453 | Compare Table 3. |
KVD_Mode | 0.851 | 0.327 | 0.359 | 0.271 | Acceptance rate for |
KVD_Median | 0.934 | 0.341 | 0.347 | 0.267 | KVD: 0.680 (0.133) |
T = 50, ρ = −0.60, k2 = 9, R2 = 0.10 | |||||
BMOM | 0.420 | 0.136 | 0.600 | 0.580 | Compare Table 4. |
KVD_Mode | 0.857 | 0.367 | 0.393 | 0.296 | Acceptance rate for |
KVD_Median | 0.927 | 0.399 | 0.406 | 0.291 | KVD: 0.482 (0.155) |
T = 100, ρ = −0.60, k2 = 4, R2 = 0.10 | |||||
BMOM | 0.676 | 0.160 | 0.362 | 0.326 | Compare Table 5. |
KVD_Mode | 0.901 | 0.213 | 0.235 | 0.186 | Acceptance rate for |
KVD_Median | 0.964 | 0.237 | 0.239 | 0.190 | KVD: 0.772 (0.110) |
T = 100, ρ = −0.60, k2 = 9, R2 = 0.10 | |||||
BMOM | 0.531 | 0.129 | 0.486 | 0.469 | Compare Table 6. |
KVD_Mode | 0.903 | 0.240 | 0.258 | 0.200 | Acceptance rate for |
KVD_Median | 0.952 | 0.247 | 0.252 | 0.198 | KVD: 0.614 (0.138) |
T = 100, ρ = −0.60, k2 = 4, R2 = 0.05 | |||||
BMOM | 0.514 | 0.181 | 0.519 | 0.486 | Compare Table 7. |
KVD_Mode | 0.813 | 0.306 | 0.358 | 0.285 | Acceptance rate for |
KVD_Median | 0.908 | 0.362 | 0.373 | 0.287 | KVD: 0.720 (0.128) |
T = 100, ρ = −0.60, k2 = 9, R2 = 0.05 | |||||
BMOM | 0.407 | 0.131 | 0.608 | 0.593 | Compare Table 8. |
KVD_Mode | 0.848 | 0.424 | 0.450 | 0.312 | Acceptance rate for |
KVD_Median | 0.907 | 0.349 | 0.361 | 0.275 | KVD: 0.585 (0.144) |
T = 100, ρ = −0.20, k2 = 4, R2 = 0.10 | |||||
BMOM | 0.753 | 0.195 | 0.314 | 0.266 | Compare Table 9. |
KVD_Mode | 1.002 | 0.267 | 0.267 | 0.208 | Acceptance rate for |
KVD_Median | 1.037 | 0.291 | 0.293 | 0.218 | KVD: 0.699 (0.162) |
T = 100, ρ = −0.20, k2 = 9, R2 = 0.10 | |||||
BMOM | 0.673 | 0.159 | 0.364 | 0.328 | Compare Table 10. |
KVD_Mode | 1.093 | 0.318 | 0.331 | 0.233 | Acceptance rate for |
KVD_Median | 1.129 | 0.279 | 0.307 | 0.241 | KVD: 0.553 (0.181) |
T = 50, ρ = −0.95, k2 = 4, R2 = 0.10 | |||||
BMOM | 0.427 | 0.120 | 0.585 | 0.573 | Compare Table 11. |
KVD_Mode | 0.737 | 0.244 | 0.359 | 0.312 | Acceptance rate for |
KVD_Median | 0.836 | 0.246 | 0.295 | 0.239 | KVD: 0.173 (0.112) |
T = 100, ρ = −0.95, k2 = 4, R2 = 0.10 | |||||
BMOM | 0.589 | 0.097 | 0.422 | 0.411 | Compare Table 12. |
KVD_Mode | 0.815 | 0.155 | 0.241 | 0.209 | Acceptance rate for |
KVD_Median | 0.889 | 0.153 | 0.189 | 0.156 | KVD: 0.179 (0.103) |
© 2019 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Gao, C.; Lahiri, K. A Comparison of Some Bayesian and Classical Procedures for Simultaneous Equation Models with Weak Instruments. Econometrics 2019, 7, 33. https://doi.org/10.3390/econometrics7030033