Adaptive independence samplers

Jonathan Keith, Dirk Kroese, George Sofronov

Research output: Contribution to journal › Article › Research › peer-review

22 Citations (Scopus)

Abstract

Markov chain Monte Carlo (MCMC) is an important computational technique for generating samples from non-standard probability distributions. A major challenge in the design of practical MCMC samplers is to achieve efficient convergence and mixing properties. One way to accelerate convergence and mixing is to adapt the proposal distribution in light of previously sampled points, thus increasing the probability of acceptance. In this paper, we propose two new adaptive MCMC algorithms based on the Independent Metropolis–Hastings algorithm. In the first, we adjust the proposal to minimize an estimate of the cross-entropy between the target and proposal distributions, using the experience of pre-runs. This approach provides a general technique for deriving natural adaptive formulae. The second approach uses multiple parallel chains, and involves updating chains individually, then updating a proposal density by fitting a Bayesian model to the population. An important feature of this approach is that adapting the proposal does not change the limiting distributions of the chains. Consequently, the adaptive phase of the sampler can be continued indefinitely. We include results of numerical experiments indicating that the new algorithms compete well with traditional Metropolis–Hastings algorithms. We also demonstrate the method for a realistic problem arising in comparative genomics.
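The cross-entropy adaptation described for the first algorithm can be illustrated with a minimal sketch. For a Gaussian proposal family, minimising the estimated cross-entropy between target and proposal reduces to moment-matching the pre-run samples; the target density, proposal parameters, and run lengths below are illustrative assumptions, not the paper's own formulae or examples.

```python
import numpy as np

rng = np.random.default_rng(0)

def independence_mh(logp, mu, sigma, x0, n):
    """Independence Metropolis-Hastings with a fixed N(mu, sigma^2) proposal."""
    def logq(x):  # proposal log-density, up to an additive constant
        return -0.5 * ((x - mu) / sigma) ** 2 - np.log(sigma)

    x = x0
    w = logp(x) - logq(x)            # log importance weight of the current state
    out = np.empty(n)
    accepted = 0
    for i in range(n):
        y = rng.normal(mu, sigma)    # proposal is independent of the current state
        wy = logp(y) - logq(y)
        if np.log(rng.random()) < wy - w:  # accept with prob min(1, exp(wy - w))
            x, w, accepted = y, wy, accepted + 1
        out[i] = x
    return out, accepted / n

# Illustrative target: N(0, 1), known only through its unnormalised log-density.
logp = lambda x: -0.5 * x ** 2

# Pre-run with a deliberately mismatched, overdispersed proposal N(1.5, 2^2).
pre, rate_pre = independence_mh(logp, mu=1.5, sigma=2.0, x0=0.0, n=5000)

# Adaptation step: for a Gaussian proposal family, minimising the estimated
# cross-entropy to the target amounts to moment-matching the pre-run samples.
mu_hat, sigma_hat = pre.mean(), pre.std()

# Re-run with the adapted proposal; the acceptance rate rises as the
# proposal moves closer to the target.
post, rate_post = independence_mh(logp, mu=mu_hat, sigma=sigma_hat, x0=0.0, n=5000)
print(f"acceptance before adaptation: {rate_pre:.2f}, after: {rate_post:.2f}")
```

Because the pre-run chain already has the target as its stationary distribution, its sample moments estimate the target's moments, which is what makes the fitted proposal a better match on the second run.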
Original language: English
Pages (from-to): 409-420
Number of pages: 12
Journal: Statistics and Computing
Volume: 18
Issue number: 4
DOIs
Publication status: Published - 2008
Externally published: Yes
