Abstract
Identifying a low-dimensional informed parameter subspace offers a viable path to alleviating the dimensionality challenge in the sample-based solution of large-scale Bayesian inverse problems. This paper introduces a novel gradient-based dimension reduction method in which the informed subspace does not depend on the data. This permits an offline-online computational strategy in which the expensive detection of the low-dimensional structure of the problem is carried out in an offline phase, that is, before the data are observed. This strategy is particularly relevant for problems requiring multiple inversions, as the same informed subspace can be reused. The proposed approach allows one to control the approximation error (in expectation over the data) of the posterior distribution. We also present sampling strategies that exploit the informed subspace to draw samples efficiently from the exact posterior distribution. The method is successfully illustrated on two numerical examples: a PDE-based inverse problem with a Gaussian process prior and a tomography problem with Poisson data and a Besov B¹₁₁ prior.
Original language | English |
---|---|
Article number | 045009 |
Number of pages | 41 |
Journal | Inverse Problems |
Volume | 37 |
Issue number | 4 |
DOIs | |
Publication status | Published - Apr 2021 |
Keywords
- Bayesian inference
- data-free informed subspace
- dimension reduction
- subspace MCMC
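A minimal sketch of the offline step described in the abstract, under simplifying assumptions not stated there: a Gaussian prior and additive Gaussian observation noise, so that the data-free construction reduces to averaging the Gauss-Newton approximation of the log-likelihood Hessian over prior samples and extracting the leading generalized eigenvectors against the prior precision. The function `data_free_informed_subspace`, the argument `grad_forward`, and the toy linear forward model are illustrative names, not the paper's implementation.

```python
import numpy as np

def data_free_informed_subspace(grad_forward, prior_cov, noise_std,
                                n_prior_samples=100, rank=5, rng=None):
    """Sketch of a data-free informed subspace (illustrative, not the paper's code).

    Averages the Gauss-Newton Hessian of the log-likelihood over prior samples
    (no data required), then solves a generalized eigenproblem against the
    prior precision to extract the leading informed directions.
    """
    rng = np.random.default_rng(rng)
    d = prior_cov.shape[0]
    L = np.linalg.cholesky(prior_cov)          # prior_cov = L @ L.T

    # Prior-averaged Gauss-Newton Hessian  H = E_prior[ J(x)^T J(x) / noise_std^2 ]
    H = np.zeros((d, d))
    for _ in range(n_prior_samples):
        x = L @ rng.standard_normal(d)         # draw x ~ N(0, prior_cov)
        J = grad_forward(x)                    # Jacobian of the forward model at x
        H += J.T @ J / noise_std**2
    H /= n_prior_samples

    # Generalized eigenproblem  H v = lam * prior_cov^{-1} v, whitened with L
    M = L.T @ H @ L
    eigvals, eigvecs = np.linalg.eigh(M)
    order = np.argsort(eigvals)[::-1][:rank]
    U = L @ eigvecs[:, order]                  # informed directions in parameter space
    return U, eigvals[order]

if __name__ == "__main__":
    # Toy linear forward model G(x) = A x, so the Jacobian is constant.
    d, m = 50, 10
    A = np.random.default_rng(0).standard_normal((m, d))
    U, lams = data_free_informed_subspace(lambda x: A, np.eye(d),
                                          noise_std=0.1, rank=5)
    print("leading eigenvalues:", np.round(lams, 2))
```

Because the averaged matrix involves only the prior and the forward model, the resulting subspace can be computed once offline and reused across repeated inversions with different data sets; online sampling (e.g. subspace MCMC) then only has to explore the retained directions while the complementary directions follow the prior.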