Randomized Multilevel Monte Carlo for Embarrassingly Parallel Inference
dc.contributor.author | Jasra, Ajay | |
dc.contributor.author | Law, Kody J. H. | |
dc.contributor.author | Tarakanov, Alexander | |
dc.contributor.author | Yu, Fangyuan | |
dc.date.accessioned | 2022-05-25T05:24:11Z | |
dc.date.available | 2021-07-14T06:32:36Z | |
dc.date.available | 2022-05-25T05:24:11Z | |
dc.date.issued | 2022-03-10 | |
dc.identifier.citation | Jasra, A., Law, K. J. H., Tarakanov, A., & Yu, F. (2022). Randomized Multilevel Monte Carlo for Embarrassingly Parallel Inference. Communications in Computer and Information Science, 3–21. https://doi.org/10.1007/978-3-030-96498-6_1 | |
dc.identifier.isbn | 9783030964979 | |
dc.identifier.isbn | 9783030964986 | |
dc.identifier.issn | 1865-0929 | |
dc.identifier.issn | 1865-0937 | |
dc.identifier.doi | 10.1007/978-3-030-96498-6_1 | |
dc.identifier.uri | http://hdl.handle.net/10754/670196 | |
dc.description.abstract | This position paper summarizes a recently developed research program focused on inference in the context of data-centric science and engineering applications, and forecasts its trajectory forward over the next decade. Often one endeavours in this context to learn complex systems in order to make more informed predictions and high-stakes decisions under uncertainty. Some key challenges which must be met in this context are robustness, generalizability, and interpretability. The Bayesian framework addresses these three challenges, while bringing with it a fourth, undesirable feature: it is typically far more expensive than its deterministic counterparts. In the 21st century, and increasingly over the past decade, a growing number of methods have emerged which allow one to leverage cheap low-fidelity models in order to precondition algorithms for performing inference with more expensive models, and so make Bayesian inference tractable in the context of high-dimensional and expensive models. Notable examples are multilevel Monte Carlo (MLMC), multi-index Monte Carlo (MIMC), and their randomized counterparts (rMLMC), which are able to provably achieve a dimension-independent (including ∞-dimensional) canonical complexity rate of 1/MSE with respect to the mean squared error (MSE). Some parallelizability is typically lost in an inference context, but recently this has been largely recovered via novel double randomization approaches. Such an approach delivers independent and identically distributed samples of quantities of interest which are unbiased with respect to the infinite-resolution target distribution. Over the coming decade, this family of algorithms has the potential to transform data-centric science and engineering, as well as classical machine learning applications such as deep learning, by scaling up and scaling out fully Bayesian inference. | |
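As a rough illustration of the single-term randomized MLMC idea sketched in the abstract (draw a random resolution level, importance-weight the coupled level correction by the probability of that level, and average the resulting i.i.d. unbiased samples in an embarrassingly parallel fashion), the following Python sketch uses a toy telescoping sequence. The function coupled_difference, the geometric level distribution, and all numerical values are illustrative assumptions, not the construction used in the paper.

import numpy as np

rng = np.random.default_rng(0)

def coupled_difference(level, rng):
    # Placeholder for a coupled estimate of Y_l - Y_{l-1}, the correction
    # between consecutive resolution levels of the quantity of interest
    # (with Y_{-1} := 0).  Toy model: Y_l = 1 - 2**-(level+1), plus noise
    # that shrinks as the level is refined.
    fine = 1.0 - 2.0 ** -(level + 1) + rng.normal(scale=2.0 ** -level)
    coarse = 0.0 if level == 0 else 1.0 - 2.0 ** -level + rng.normal(scale=2.0 ** -level)
    return fine - coarse

def rmlmc_sample(rng, p_geo=0.5):
    # One i.i.d., unbiased sample of the infinite-resolution limit:
    # draw a random level L with P(L = l) = p_geo * (1 - p_geo)**l and
    # importance-weight the coupled correction by 1 / P(L = l).
    level = int(rng.geometric(p_geo)) - 1      # levels 0, 1, 2, ...
    prob = p_geo * (1.0 - p_geo) ** level
    return coupled_difference(level, rng) / prob

# Embarrassingly parallel: every sample is independent, so they can be
# generated on separate workers and simply averaged at the end.
samples = [rmlmc_sample(rng) for _ in range(100_000)]
print("estimate:", np.mean(samples), "+/-", np.std(samples) / len(samples) ** 0.5)

The toy only demonstrates the unbiasedness and the independent-sample structure; the double randomization referred to in the abstract additionally randomizes inside the within-level (inference) estimator, which is beyond this sketch.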
dc.description.sponsorship | KJHL and AT were supported by The Alan Turing Institute under the EPSRC grant EP/N510129/1. AJ and FY acknowledge KAUST baseline support. | |
dc.publisher | Springer International Publishing | |
dc.relation.url | https://link.springer.com/10.1007/978-3-030-96498-6_1 | |
dc.rights | Archived with thanks to Springer International Publishing | |
dc.subject | Randomization Methods | |
dc.subject | Markov chain Monte Carlo | |
dc.subject | Bayesian Inference | |
dc.title | Randomized Multilevel Monte Carlo for Embarrassingly Parallel Inference | |
dc.type | Conference Paper | |
dc.contributor.department | Computer, Electrical and Mathematical Sciences and Engineering Division, King Abdullah University of Science and Technology, Thuwal, 23955, Kingdom of Saudi Arabia | |
dc.contributor.department | Computer, Electrical and Mathematical Science and Engineering (CEMSE) Division | |
dc.contributor.department | Statistics | |
dc.rights.embargodate | 2023-03-10 | |
dc.conference.date | 2021-10-18 to 2021-10-20 | |
dc.conference.name | 21st Smoky Mountains Computational Sciences and Engineering Conference, SMC 2021 | |
dc.conference.location | Virtual, Online | |
dc.eprint.version | Post-print | |
dc.contributor.institution | Department of Mathematics, University of Manchester, Manchester, M13 9PL, UK | |
dc.identifier.volume | 1512 CCIS | |
dc.identifier.pages | 3-21 | |
dc.identifier.arxivid | 2107.01913 | |
kaust.person | Jasra, Ajay | |
kaust.person | Yu, Fangyuan | |
dc.identifier.eid | 2-s2.0-85127041719 | |
refterms.dateFOA | 2021-07-14T06:33:00Z | |
kaust.acknowledged.supportUnit | Baseline support