Stein variational gradient descent with variance reduction

Research output: Chapter in Book/Report/Conference proceeding › Conference Paper › Research › peer-review

Abstract

Probabilistic inference is a common and important task in statistical machine learning. The recently proposed Stein variational gradient descent (SVGD) is a generic Bayesian inference method that has been successfully applied in a wide range of contexts, especially with large datasets, where existing probabilistic inference methods are often ineffective. In the large-scale setting, SVGD employs a mini-batch strategy, but its mini-batch estimator has large variance, which compromises estimation quality in practice. To address this, we propose a generic SVGD-based inference method that significantly reduces the variance of the mini-batch estimator when working with large datasets. Our experiments on 14 datasets show that the proposed method achieves substantial and consistent improvements over baseline methods on binary classification (including a pseudo-online learning setting) and regression tasks. Furthermore, our framework is generic and applicable to a wide range of probabilistic inference problems, such as Bayesian neural networks and Markov random fields.
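The record does not spell out the estimator, so the sketch below is only a generic illustration of the kind of approach the abstract describes: the standard SVGD particle update with an RBF kernel, combined with an SVRG-style control-variate estimate of the log-posterior gradient. It is not the paper's algorithm; the function names (`rbf_kernel`, `svgd_step`, `make_vr_grad_log_post`), the fixed kernel bandwidth, and the snapshot-based control variate are all assumptions made for the example.

```python
import numpy as np

def rbf_kernel(particles, bandwidth=1.0):
    # Pairwise RBF kernel K[j, i] = k(x_j, x_i) and its gradient w.r.t. x_j.
    diffs = particles[:, None, :] - particles[None, :, :]   # (n, n, d), diffs[j, i] = x_j - x_i
    sq_dists = np.sum(diffs ** 2, axis=-1)                  # (n, n)
    K = np.exp(-sq_dists / (2.0 * bandwidth ** 2))
    grad_K = -diffs * K[:, :, None] / bandwidth ** 2        # grad_{x_j} k(x_j, x_i)
    return K, grad_K

def svgd_step(particles, grad_log_post, step_size=1e-2):
    # One SVGD update: phi(x_i) = (1/n) sum_j [ k(x_j, x_i) grad log p(x_j) + grad_{x_j} k(x_j, x_i) ].
    n = particles.shape[0]
    K, grad_K = rbf_kernel(particles)
    grads = np.stack([grad_log_post(x) for x in particles])  # (n, d)
    phi = (K @ grads + grad_K.sum(axis=0)) / n
    return particles + step_size * phi

def make_vr_grad_log_post(grad_log_prior, grad_log_lik, data, snapshot, batch_size, rng):
    # SVRG-style control variate (an assumption, not necessarily the paper's estimator):
    # full-data likelihood gradient at a fixed snapshot plus a rescaled mini-batch
    # correction. The estimator stays unbiased for any fixed snapshot, and its
    # variance shrinks when the snapshot is close to the current parameter.
    N = len(data)
    full_at_snapshot = sum(grad_log_lik(snapshot, z) for z in data)
    def grad_log_post(theta):
        idx = rng.choice(N, size=batch_size, replace=False)
        correction = sum(grad_log_lik(theta, data[i]) - grad_log_lik(snapshot, data[i])
                         for i in idx)
        return grad_log_prior(theta) + full_at_snapshot + (N / batch_size) * correction
    return grad_log_post
```

In such a scheme the snapshot would typically be refreshed periodically (for example, to the current particle mean) so the control variate stays effective as the particles move.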

Original language: English
Title of host publication: 2020 International Joint Conference on Neural Networks, IJCNN 2020 - Proceedings
Editors: Asim Roy
Place of Publication: Piscataway NJ, USA
Publisher: IEEE, Institute of Electrical and Electronics Engineers
Number of pages: 8
ISBN (Electronic): 9781728169262
ISBN (Print): 9781728169279
DOIs
Publication status: Published - 2020
Event: IEEE International Joint Conference on Neural Networks 2020 - Virtual, Glasgow, United Kingdom
Duration: 19 Jul 2020 - 24 Jul 2020
https://ieeexplore.ieee.org/xpl/conhome/9200848/proceeding (Proceedings)
https://wcci2020.org/ijcnn-sessions/ (Website)

Conference

Conference: IEEE International Joint Conference on Neural Networks 2020
Abbreviated title: IJCNN 2020
Country: United Kingdom
City: Virtual, Glasgow
Period: 19/07/20 - 24/07/20
Internet address

Keywords

  • Bayesian inference
  • statistical machine learning
  • variance reduction
