Models with a large number of latent variables are often used to exploit the information in big or complex data, but can be difficult to estimate. Variational inference methods provide an attractive solution. These methods use an approximation to the posterior density, yet for large latent variable models existing choices can be inaccurate or slow to calibrate. Here, we propose a family of tractable variational approximations that are more accurate and faster to calibrate for this case. The approximation combines a parsimonious approximation for the parameter posterior with the exact conditional posterior of the latent variables. We derive a simplified expression for the re-parameterization gradient of the variational lower bound, which is the main ingredient of the optimization algorithms used for calibration. Implementation requires only exact or approximate generation from the conditional posterior of the latent variables, rather than computation of their density. In effect, our method provides a new way to employ Markov chain Monte Carlo (MCMC) within variational inference. We illustrate the approach using two complex contemporary econometric examples. The first is a nonlinear multivariate state space model for U.S. macroeconomic variables. The second is a random coefficients tobit model applied to two million sales by 20,000 individuals in a consumer panel. In both cases, our approximating family is considerably more accurate than mean field or structured Gaussian approximations, and faster than MCMC. Finally, we show how to implement data sub-sampling in variational inference for our approximation, further reducing computation time. MATLAB code implementing the method is provided.
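The re-parameterization gradient mentioned in the abstract is the generic idea of writing a variational draw as a deterministic transform of a fixed-distribution noise variable, so that the gradient of the lower bound can be estimated by Monte Carlo. The following is a minimal, self-contained sketch of that generic idea for a one-dimensional Gaussian variational approximation to a toy posterior; it is illustrative only and is not the paper's approximation, simplified gradient expression, or MATLAB implementation. All names and the toy target are assumptions introduced here.

```python
# Minimal sketch of the re-parameterization gradient trick (generic idea only,
# not the paper's method). We fit q(theta) = N(mu, sigma^2) to a toy target
# posterior N(3, 1) by stochastic gradient ascent on the variational lower bound.
import numpy as np

rng = np.random.default_rng(0)

def grad_log_post(theta):
    # Gradient of the toy (unnormalized) log posterior of N(3, 1).
    return -(theta - 3.0)

# Variational parameters: mean mu and log standard deviation log_sigma.
mu, log_sigma = 0.0, 0.0
lr = 0.05  # step size for stochastic gradient ascent

for _ in range(2000):
    eps = rng.standard_normal()        # eps ~ N(0, 1), parameter-free noise
    sigma = np.exp(log_sigma)
    theta = mu + sigma * eps           # re-parameterized draw from q
    g = grad_log_post(theta)           # d log p(theta) / d theta
    grad_mu = g                        # chain rule: d theta / d mu = 1
    grad_ls = g * sigma * eps + 1.0    # d theta / d log_sigma = sigma * eps,
                                       # plus d entropy / d log_sigma = 1
    mu += lr * grad_mu                 # ascend the noisy lower-bound gradient
    log_sigma += lr * grad_ls
```

After the loop, `mu` settles near 3 and `exp(log_sigma)` near 1, recovering the toy posterior. The paper's contribution concerns a far harder setting, where the variational family includes the exact conditional posterior of many latent variables and the gradient estimate needs only draws from that conditional, not its density.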
- Large consumer panels
- Latent variable models
- Stochastic gradient ascent
- Sub-sampling variational inference
- Time-varying VAR with stochastic volatility