Generalized robust Bayesian committee machine for large-scale Gaussian process regression

Haitao Liu, Jianfei Cai, Yi Wang, Yew-Soon Ong

Research output: Chapter in Book/Report/Conference proceeding › Conference Paper › Research › peer-review

15 Citations (Scopus)

Abstract

In order to scale standard Gaussian process (GP) regression to large-scale datasets, aggregation models employ a factorized training process and then combine predictions from distributed experts. The state-of-the-art aggregation models, however, either provide inconsistent predictions or require a time-consuming aggregation process. We first prove the inconsistency of typical aggregations using disjoint or random data partition, and then present a consistent yet efficient aggregation model for large-scale GP. The proposed model inherits the advantages of aggregations, e.g., closed-form inference and aggregation, parallelization and distributed computing. Furthermore, theoretical and empirical analyses reveal that the new aggregation model performs better due to the consistent predictions that converge to the true underlying function when the training size approaches infinity.
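The expert-aggregation scheme the abstract describes can be sketched in a few lines: each expert fits an exact GP on its own data subset, and the predictions are combined in closed form. The sketch below implements the standard robust Bayesian committee machine combination rule that this paper generalizes, not the paper's own generalized variant; the kernel hyperparameters, function names, and toy data are our illustrative assumptions.

```python
import numpy as np

def rbf_kernel(A, B, lengthscale=1.0, variance=1.0):
    # Squared-exponential kernel matrix between the rows of A and B.
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def gp_predict(Xtr, ytr, Xte, lengthscale=1.0, noise=1e-2):
    # Exact GP posterior mean and variance for one expert's data subset.
    K = rbf_kernel(Xtr, Xtr, lengthscale) + noise * np.eye(len(Xtr))
    Ks = rbf_kernel(Xtr, Xte, lengthscale)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, ytr))
    v = np.linalg.solve(L, Ks)
    mu = Ks.T @ alpha
    var = np.diag(rbf_kernel(Xte, Xte, lengthscale)) - np.sum(v**2, 0) + noise
    return mu, var

def rbcm_aggregate(mus, vars_, prior_var):
    # Robust BCM: weight each expert by the differential-entropy change
    # between the prior and its posterior, then correct the precision
    # with the prior term so the weights sum consistently.
    betas = [0.5 * (np.log(prior_var) - np.log(v)) for v in vars_]
    prec = sum(b / v for b, v in zip(betas, vars_)) \
         + (1.0 - sum(betas)) / prior_var
    mean = sum(b * m / v for b, m, v in zip(betas, mus, vars_)) / prec
    return mean, 1.0 / prec

# Toy example: 3 disjoint experts on sin(2*pi*x), aggregated at 3 test inputs.
rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(90, 1))
y = np.sin(2 * np.pi * X[:, 0])
experts = np.array_split(rng.permutation(90), 3)
Xte = np.array([[0.25], [0.5], [0.75]])
preds = [gp_predict(X[idx], y[idx], Xte, lengthscale=0.2) for idx in experts]
mean, var = rbcm_aggregate([m for m, _ in preds],
                           [v for _, v in preds],
                           prior_var=1.0 + 1e-2)
```

Because each expert only factorizes the Cholesky of its own subset, training cost drops from cubic in the full dataset to cubic in the subset size, and the experts can run in parallel; the aggregation itself is a cheap closed-form weighted sum.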

Original language: English
Title of host publication: Proceedings of Machine Learning Research
Subtitle of host publication: International Conference on Machine Learning, 10-15 July 2018, Stockholmsmässan, Stockholm, Sweden
Editors: Jennifer Dy, Andreas Krause
Place of Publication: Stroudsburg PA USA
Publisher: International Machine Learning Society (IMLS)
Pages: 3131-3140
Number of pages: 10
Volume: 80
ISBN (Electronic): 9781510867963
Publication status: Published - 2018
Externally published: Yes
Event: International Conference on Machine Learning 2018 - Stockholmsmässan, Stockholm, Sweden
Duration: 10 Jul 2018 - 15 Jul 2018
Conference number: 35th

Conference

Conference: International Conference on Machine Learning 2018
Abbreviated title: ICML 2018
Country/Territory: Sweden
City: Stockholm
Period: 10/07/18 - 15/07/18
