Data-free likelihood-informed dimension reduction of Bayesian inverse problems

Tiangang Cui, Olivier Zahm

Research output: Contribution to journal › Article › Research › peer-review

14 Citations (Scopus)

Abstract

Identifying a low-dimensional informed parameter subspace offers a viable path to alleviating the dimensionality challenge in the sampling-based solution of large-scale Bayesian inverse problems. This paper introduces a novel gradient-based dimension reduction method in which the informed subspace does not depend on the data. This permits an online-offline computational strategy in which the expensive task of detecting the low-dimensional structure of the problem is carried out in an offline phase, that is, before observing the data. This strategy is particularly relevant for multiple inversion problems, as the same informed subspace can be reused. The proposed approach allows the approximation error (in expectation over the data) of the posterior distribution to be controlled. We also present sampling strategies that exploit the informed subspace to draw samples efficiently from the exact posterior distribution. The method is successfully illustrated on two numerical examples: a PDE-based inverse problem with a Gaussian process prior and a tomography problem with Poisson data and a Besov $B_{11}^{2}$ prior.
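For readers who want a concrete picture of the kind of offline construction the abstract alludes to, the following is a minimal, hypothetical sketch in Python/NumPy, not the authors' implementation: it builds a generic likelihood-informed subspace for a linear-Gaussian toy problem by averaging a Gauss-Newton Hessian of the negative log-likelihood over the prior and keeping the leading generalized eigenvectors. All names and sizes (G, C_pr, C_obs_inv, d, m, r, forward_model) are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)
d, m, r = 50, 20, 5                      # parameter dim, data dim, subspace rank (illustrative)

G = rng.standard_normal((m, d))          # stand-in linear forward-model Jacobian (assumed)
C_pr = np.eye(d)                         # prior covariance (identity for simplicity)
C_obs_inv = np.eye(m)                    # observation-noise precision

# Offline phase: Monte Carlo estimate, over the prior, of the expected Gauss-Newton
# Hessian of the negative log-likelihood. With Gaussian noise this matrix does not
# involve the data; for a nonlinear model the Jacobian would be re-evaluated at each
# prior sample.
n_mc = 100
L_pr = np.linalg.cholesky(C_pr)
H = np.zeros((d, d))
for _ in range(n_mc):
    x = L_pr @ rng.standard_normal(d)    # prior sample (unused by the linear Jacobian here)
    J = G                                # nonlinear case: J = jacobian(forward_model, x)
    H += J.T @ C_obs_inv @ J
H /= n_mc

# Informed directions from the generalized eigenproblem H v = lam * C_pr^{-1} v,
# solved in prior-whitened coordinates.
Hw = L_pr.T @ H @ L_pr
lam, W = np.linalg.eigh(Hw)
idx = np.argsort(lam)[::-1][:r]
U = L_pr @ W[:, idx]                     # basis of the r-dimensional informed subspace

def project(x):
    # Prior-weighted projection onto the informed subspace: P x = U W_r^T L_pr^{-1} x.
    return U @ (W[:, idx].T @ np.linalg.solve(L_pr, x))

print("leading eigenvalues:", np.sort(lam)[::-1][:r])

In the spirit of the abstract, the eigendecomposition above would be computed once offline and the resulting subspace reused across data sets; an online sampler (e.g. a subspace MCMC scheme) would then update only the r informed coordinates, with the complementary directions handled by the prior.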

Original language: English
Article number: 045009
Number of pages: 41
Journal: Inverse Problems
Volume: 37
Issue number: 4
DOIs
Publication status: Published - Apr 2021

Keywords

  • Bayesian inference
  • data-free informed subspace
  • dimension reduction
  • subspace MCMC
