Certified dimension reduction in nonlinear Bayesian inverse problems

Olivier Zahm, Tiangang Cui, Kody Law, Alessio Spantini, Youssef Marzouk

Research output: Contribution to journal › Article › Research › peer-review

20 Citations (Scopus)

Abstract

We propose a dimension reduction technique for Bayesian inverse problems with nonlinear forward operators, non-Gaussian priors, and non-Gaussian observation noise. The likelihood function is approximated by a ridge function, i.e., a map which depends nontrivially only on a few linear combinations of the parameters. We build this ridge approximation by minimizing an upper bound on the Kullback–Leibler divergence between the posterior distribution and its approximation. This bound, obtained via logarithmic Sobolev inequalities, allows one to certify the error of the posterior approximation. Computing the bound requires the second moment matrix of the gradient of the log-likelihood function, which in practice must be approximated from samples. We provide an analysis that enables control of the posterior approximation error due to this sampling. Numerical and theoretical comparisons with existing methods illustrate the benefits of the proposed methodology.
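The core computation described in the abstract can be sketched in a few lines of NumPy: estimate the second moment matrix of the log-likelihood gradient by Monte Carlo over prior samples, then take its leading eigenvectors as the ridge directions. The linear-Gaussian toy problem, the standard Gaussian prior, the noise level, and all variable names below are illustrative assumptions, not the paper's examples or code.

```python
import numpy as np

# Hypothetical toy setup: a Gaussian likelihood whose log depends on the
# parameter x only through G @ x, so the informed subspace has dimension 3.
rng = np.random.default_rng(0)
d = 10                                  # parameter dimension (assumption)
G = rng.standard_normal((3, d))         # linear forward operator (assumption)
y = rng.standard_normal(3)              # observed data (assumption)
sigma = 0.5                             # observation noise std (assumption)

def grad_log_likelihood(x):
    # Gradient of the Gaussian log-likelihood: -(1/sigma^2) G^T (G x - y)
    return -G.T @ (G @ x - y) / sigma**2

# Monte Carlo estimate of H = E_prior[ grad(log L) grad(log L)^T ]
n = 2000
X = rng.standard_normal((n, d))         # samples from a standard Gaussian prior
H = np.zeros((d, d))
for x in X:
    g = grad_log_likelihood(x)
    H += np.outer(g, g)
H /= n

# The leading eigenvectors of H span the directions the likelihood
# is most sensitive to; they define the ridge approximation.
eigvals, eigvecs = np.linalg.eigh(H)
order = np.argsort(eigvals)[::-1]
r = 3                                   # reduced dimension (rank of G here)
U_r = eigvecs[:, order[:r]]             # ridge directions, d x r

# Because the likelihood only depends on G x, the eigenvalues of H
# beyond the rank of G are numerically zero.
print(eigvals[order])
```

In this toy case the eigenvalue spectrum of `H` drops to (numerical) zero after the first three modes, so truncating at `r = 3` loses nothing; in a genuinely nonlinear problem the spectrum decays gradually and the certified bound in the paper quantifies the truncation error.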

Original language: English
Pages (from-to): 1789-1835
Number of pages: 47
Journal: Mathematics of Computation
Volume: 91
DOIs
Publication status: Published - 2022
