Likelihood-informed dimension reduction for nonlinear inverse problems

T. Cui, J. Martin, Y. M. Marzouk, A. Solonen, A. Spantini

Research output: Contribution to journal › Article › peer-review



The intrinsic dimensionality of an inverse problem is affected by prior information, the accuracy and number of observations, and the smoothing properties of the forward operator. From a Bayesian perspective, changes from the prior to the posterior may, in many problems, be confined to a relatively low-dimensional subspace of the parameter space. We present a dimension reduction approach that defines and identifies such a subspace, called the 'likelihood-informed subspace' (LIS), by characterizing the relative influences of the prior and the likelihood over the support of the posterior distribution. This identification enables new and more efficient computational methods for Bayesian inference with nonlinear forward models and Gaussian priors. In particular, we approximate the posterior distribution as the product of a lower-dimensional posterior defined on the LIS and the prior distribution marginalized onto the complementary subspace. Markov chain Monte Carlo sampling can then proceed in lower dimensions, with significant gains in computational efficiency. We also introduce a Rao-Blackwellization strategy that de-randomizes Monte Carlo estimates of posterior expectations for additional variance reduction. We demonstrate the efficiency of our methods using two numerical examples: inference of permeability in a groundwater system governed by an elliptic PDE, and an atmospheric remote sensing problem based on Global Ozone Monitoring by Occultation of Stars (GOMOS) observations.
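The abstract's central construction can be illustrated concretely. For a Gaussian prior and a locally linearized forward model, likelihood-informed directions are commonly found by an eigendecomposition of the prior-preconditioned Gauss-Newton Hessian of the data misfit: eigenvalues above a cutoff mark directions where the likelihood dominates the prior. The sketch below is illustrative only, not the authors' code; the Jacobian `J`, threshold `tau`, and helper name are assumptions for a single linearization point (the paper averages such information over the posterior support).

```python
import numpy as np

def likelihood_informed_subspace(J, Gamma_obs, Gamma_pr, tau=0.1):
    """Illustrative LIS identification at one linearization point.

    J         : Jacobian of the forward model (m observations x n parameters)
    Gamma_obs : observation noise covariance (m x m)
    Gamma_pr  : Gaussian prior covariance (n x n)
    tau       : eigenvalue cutoff separating likelihood- from prior-dominated
                directions (a tuning choice, assumed here)
    """
    L = np.linalg.cholesky(Gamma_pr)          # prior factor: Gamma_pr = L @ L.T
    # Gauss-Newton Hessian of the negative log-likelihood
    H = J.T @ np.linalg.solve(Gamma_obs, J)
    # prior-preconditioned Hessian; its spectrum compares likelihood vs. prior
    H_tilde = L.T @ H @ L
    lam, V = np.linalg.eigh(H_tilde)
    order = np.argsort(lam)[::-1]             # sort eigenpairs descending
    lam, V = lam[order], V[:, order]
    r = int(np.sum(lam > tau))                # number of data-informed directions
    Phi = L @ V[:, :r]                        # LIS basis in parameter space
    return Phi, lam[:r]

# toy example: 50-dimensional parameter, only 10 observations,
# so at most 10 directions can be informed by the data
rng = np.random.default_rng(0)
J = 0.5 * rng.standard_normal((10, 50))
Phi, lam = likelihood_informed_subspace(J, np.eye(10), np.eye(50))
print(Phi.shape)
```

Sampling can then run over coordinates in the column space of `Phi` while the complementary directions are handled by the prior marginal, which is the decomposition the abstract describes.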

Original language: English
Article number: 114015
Number of pages: 28
Journal: Inverse Problems
Issue number: 11
Publication status: Published - 28 Oct 2014
Externally published: Yes


Keywords

  • Bayesian inference
  • dimension reduction
  • inverse problem
  • low-rank approximation
  • Markov chain Monte Carlo
  • variance reduction
