Name: 1-s2.0-S0021999115006701-main.pdf
Size: 1.441 MB
Format: PDF
Description: Accepted Manuscript
Type: Article
Date: 2015-10-19
Online Publication Date: 2015-10-19
Print Publication Date: 2016-01
Permanent link to this record: http://hdl.handle.net/10754/581312
Abstract:
Many Bayesian inference problems require exploring the posterior distribution of high-dimensional parameters that represent the discretization of an underlying function. This work introduces a family of Markov chain Monte Carlo (MCMC) samplers that can adapt to the particular structure of a posterior distribution over functions. Two distinct lines of research intersect in the methods developed here. First, we introduce a general class of operator-weighted proposal distributions that are well defined on function space, such that the performance of the resulting MCMC samplers is independent of the discretization of the function. Second, by exploiting local Hessian information and any associated low-dimensional structure in the change from prior to posterior distributions, we develop an inhomogeneous discretization scheme for the Langevin stochastic differential equation that yields operator-weighted proposals adapted to the non-Gaussian structure of the posterior. The resulting dimension-independent and likelihood-informed (DILI) MCMC samplers may be useful for a large class of high-dimensional problems where the target probability measure has a density with respect to a Gaussian reference measure. Two nonlinear inverse problems are used to demonstrate the efficiency of these DILI samplers: an elliptic PDE coefficient inverse problem and path reconstruction in a conditioned diffusion.
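The abstract centers on proposals that remain well defined on function space, so sampler performance does not deteriorate as the discretization is refined. As a minimal sketch of that dimension-independence idea only (not the paper's operator-weighted, Hessian-informed DILI proposal), the Python snippet below implements a preconditioned Crank-Nicolson (pCN) step against a hypothetical Gaussian prior and a toy likelihood; the covariance C_prior, log_likelihood, and step size beta are illustrative placeholders, not quantities from the paper.

import numpy as np

# Minimal pCN sampler on a discretized function space.
# The proposal  v = sqrt(1 - beta^2) * u + beta * xi,  xi ~ N(0, C_prior),
# leaves the Gaussian prior invariant, so the Metropolis-Hastings acceptance
# ratio depends only on the likelihood and mixing does not degrade as the
# discretization dimension n grows.

rng = np.random.default_rng(0)
n = 200                                    # discretization dimension (mesh size)

# Hypothetical prior: zero-mean Gaussian with a squared-exponential covariance
# on a uniform grid over [0, 1] (jitter added for a stable Cholesky factor).
x = np.linspace(0.0, 1.0, n)
C_prior = np.exp(-0.5 * ((x[:, None] - x[None, :]) / 0.1) ** 2) + 1e-6 * np.eye(n)
L = np.linalg.cholesky(C_prior)            # draws xi = L @ z with z ~ N(0, I)

def log_likelihood(u):
    # Toy data misfit: observe the spatial mean of u with noise std 0.1.
    return -0.5 * ((u.mean() - 1.0) / 0.1) ** 2

def pcn_step(u, beta=0.2):
    # Prior-preserving pCN proposal followed by a likelihood-only accept/reject.
    xi = L @ rng.standard_normal(n)
    v = np.sqrt(1.0 - beta ** 2) * u + beta * xi
    log_alpha = log_likelihood(v) - log_likelihood(u)
    if np.log(rng.random()) < log_alpha:
        return v, True
    return u, False

u = L @ rng.standard_normal(n)             # initialize from a prior draw
accepted = 0
for _ in range(5000):
    u, acc = pcn_step(u)
    accepted += acc
print(f"acceptance rate: {accepted / 5000:.2f}")

Because the proposal preserves the prior, the prior terms cancel in the acceptance ratio; the DILI samplers of the paper build on this property while additionally weighting the proposal with operators informed by local Hessian (likelihood) structure.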
Citation: Dimension-independent likelihood-informed MCMC (2015), Journal of Computational Physics
Publisher: Elsevier BV
Journal: Journal of Computational Physics
arXiv: 1411.3688
Additional Links: http://linkinghub.elsevier.com/retrieve/pii/S0021999115006701
DOI: 10.1016/j.jcp.2015.10.008