TY - JOUR
T1 - Updating Variational Bayes
T2 - fast sequential posterior inference
AU - Tomasetti, Nathaniel
AU - Forbes, Catherine
AU - Panagiotelis, Anastasios
N1 - Funding Information:
Catherine Forbes acknowledges financial support under the Australian Research Council Discovery Grant No. DP150101728 and the National Science Foundation Grant SES-1921523.
Publisher Copyright:
© 2021, The Author(s), under exclusive licence to Springer Science+Business Media, LLC, part of Springer Nature.
PY - 2022/2/15
Y1 - 2022/2/15
N2 - Variational Bayesian (VB) methods produce posterior inference in a time frame considerably smaller than traditional Markov chain Monte Carlo approaches. Although the VB posterior is an approximation, it has been shown to produce good parameter estimates and predicted values when a rich class of approximating distributions is considered. In this paper, we propose the use of recursive algorithms to update a sequence of VB posterior approximations in an online, time series setting, with the computation of each posterior update requiring only the data observed since the previous update. We show how importance sampling can be incorporated into online variational inference, allowing the user to trade accuracy for a substantial increase in computational speed. The proposed methods and their properties are detailed in two separate simulation studies. Additionally, two empirical illustrations are provided, including one where a Dirichlet Process Mixture model with a novel posterior dependence structure is repeatedly updated in the context of predicting the future behaviour of vehicles on a stretch of the US Highway 101.
AB - Variational Bayesian (VB) methods produce posterior inference in a time frame considerably smaller than traditional Markov chain Monte Carlo approaches. Although the VB posterior is an approximation, it has been shown to produce good parameter estimates and predicted values when a rich class of approximating distributions is considered. In this paper, we propose the use of recursive algorithms to update a sequence of VB posterior approximations in an online, time series setting, with the computation of each posterior update requiring only the data observed since the previous update. We show how importance sampling can be incorporated into online variational inference, allowing the user to trade accuracy for a substantial increase in computational speed. The proposed methods and their properties are detailed in two separate simulation studies. Additionally, two empirical illustrations are provided, including one where a Dirichlet Process Mixture model with a novel posterior dependence structure is repeatedly updated in the context of predicting the future behaviour of vehicles on a stretch of the US Highway 101.
KW - Clustering
KW - Dirichlet process mixture
KW - Forecasting
KW - Importance sampling
KW - Variational inference
UR - http://www.scopus.com/inward/record.url?scp=85120684483&partnerID=8YFLogxK
U2 - 10.1007/s11222-021-10062-2
DO - 10.1007/s11222-021-10062-2
M3 - Article
AN - SCOPUS:85120684483
SN - 0960-3174
VL - 32
JO - Statistics and Computing
JF - Statistics and Computing
IS - 1
M1 - 4
ER -