Multi-index Monte Carlo: when sparsity meets sampling

Handle URI:
http://hdl.handle.net/10754/581768
Title:
Multi-index Monte Carlo: when sparsity meets sampling
Authors:
Haji Ali, Abdul Lateef (0000-0002-6243-0335); Nobile, Fabio; Tempone, Raul (0000-0003-1967-4446)
Abstract:
We propose and analyze a novel multi-index Monte Carlo (MIMC) method for weak approximation of stochastic models that are described in terms of differential equations either driven by random measures or with random coefficients. The MIMC method is both a stochastic version of the combination technique introduced by Zenger, Griebel and collaborators and an extension of the multilevel Monte Carlo (MLMC) method first described by Heinrich and Giles. Inspired by Giles's seminal work, we use in MIMC high-order mixed differences instead of the first-order differences used in MLMC to reduce the variance of the hierarchical differences dramatically. This in turn yields new and improved complexity results, which are natural generalizations of Giles's MLMC analysis and which increase the domain of the problem parameters for which we achieve the optimal convergence, O(TOL^-2). Moreover, in MIMC, the rate of increase of required memory with respect to TOL is independent of the number of directions up to a logarithmic term, which allows far more accurate solutions to be calculated in higher dimensions than is possible when using MLMC. We motivate the setting of MIMC by first focusing on a simple full tensor index set. We then propose a systematic construction of optimal sets of indices for MIMC based on properly defined profits that in turn depend on the average cost per sample and the corresponding weak error and variance. Under standard assumptions on the convergence rates of the weak error, variance, and work per sample, the optimal index set turns out to be of total degree type. In some cases, using optimal index sets, MIMC achieves a better rate for the computational complexity than the corresponding rate when using full tensor index sets.
We also show the asymptotic normality of the statistical error in the resulting MIMC estimator and justify in this way our error estimate, which allows both the required accuracy and the confidence level in our computational results to be prescribed. Finally, we include numerical experiments involving a partial differential equation posed in three spatial dimensions and with random coefficients to substantiate the analysis and illustrate the corresponding computational savings of MIMC.
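To make the abstract's central idea concrete, the following is a minimal, self-contained sketch (not the paper's estimator or its adaptive algorithm) of an MIMC estimator on a toy problem: a random functional with two discretization directions, approximated by first-order mixed differences summed over a total-degree index set. The toy quantity of interest, the sample counts, and all function names here are illustrative assumptions.

```python
import math
import random

def q(omega, l1, l2):
    """Toy discretized quantity of interest: a tensor midpoint-rule
    approximation of omega * int_0^1 int_0^1 sin(x+y) dx dy, with
    2^l1 and 2^l2 points in the two discretization directions."""
    n1, n2 = 2 ** l1, 2 ** l2
    h1, h2 = 1.0 / n1, 1.0 / n2
    total = 0.0
    for i in range(n1):
        x = (i + 0.5) * h1
        for j in range(n2):
            y = (j + 0.5) * h2
            total += math.sin(x + y)
    return omega * total * h1 * h2

def mixed_diff(omega, a1, a2):
    """First-order mixed difference Delta_1 Delta_2 Q at index (a1, a2),
    with Q at a negative level taken to be zero; the same sample omega
    couples all four terms, which is what makes the variance small."""
    s = 0.0
    for d1, s1 in ((0, 1.0), (-1, -1.0)):
        for d2, s2 in ((0, 1.0), (-1, -1.0)):
            l1, l2 = a1 + d1, a2 + d2
            if l1 < 0 or l2 < 0:
                continue
            s += s1 * s2 * q(omega, l1, l2)
    return s

def mimc(L, n0=4000, seed=0):
    """MIMC estimator over the total-degree index set {a1 + a2 <= L}:
    sum of Monte Carlo averages of the mixed differences.  The per-index
    sample count n0 / 2^(a1+a2) is a crude stand-in for the paper's
    optimized sample allocation."""
    rng = random.Random(seed)
    est = 0.0
    for a1 in range(L + 1):
        for a2 in range(L + 1 - a1):
            n = max(100, n0 >> (a1 + a2))
            acc = 0.0
            for _ in range(n):
                omega = rng.uniform(0.5, 1.5)  # random coefficient, E[omega] = 1
                acc += mixed_diff(omega, a1, a2)
            est += acc / n
    return est

# The mixed differences telescope, so the estimator targets
# E[omega] * (2 sin(1) - sin(2)) up to truncation and sampling error.
exact = 2 * math.sin(1) - math.sin(2)
print(mimc(4), exact)
```

The telescoping sum over the index set reproduces the expectation of the finest approximations while concentrating most samples on the cheap coarse indices; the total-degree shape of the set is exactly the structure the abstract identifies as optimal under standard rate assumptions.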
KAUST Department:
Applied Mathematics and Computational Science Program
Citation:
Multi-index Monte Carlo: when sparsity meets sampling (2015). Numerische Mathematik
Publisher:
Springer Science + Business Media
Journal:
Numerische Mathematik
Issue Date:
27-Jun-2015
DOI:
10.1007/s00211-015-0734-5
Type:
Article
ISSN:
0029-599X; 0945-3245
Additional Links:
http://link.springer.com/10.1007/s00211-015-0734-5
Appears in Collections:
Articles; Applied Mathematics and Computational Science Program

Full metadata record

DC Field | Value | Language
dc.contributor.author | Haji Ali, Abdul Lateef | en
dc.contributor.author | Nobile, Fabio | en
dc.contributor.author | Tempone, Raul | en
dc.date.accessioned | 2015-11-05T06:24:48Z | en
dc.date.available | 2015-11-05T06:24:48Z | en
dc.date.issued | 2015-06-27 | en
dc.identifier.citation | Multi-index Monte Carlo: when sparsity meets sampling 2015 Numerische Mathematik | en
dc.identifier.issn | 0029-599X | en
dc.identifier.issn | 0945-3245 | en
dc.identifier.doi | 10.1007/s00211-015-0734-5 | en
dc.identifier.uri | http://hdl.handle.net/10754/581768 | en
dc.description.abstract | We propose and analyze a novel multi-index Monte Carlo (MIMC) method for weak approximation of stochastic models that are described in terms of differential equations either driven by random measures or with random coefficients. The MIMC method is both a stochastic version of the combination technique introduced by Zenger, Griebel and collaborators and an extension of the multilevel Monte Carlo (MLMC) method first described by Heinrich and Giles. Inspired by Giles's seminal work, we use in MIMC high-order mixed differences instead of the first-order differences used in MLMC to reduce the variance of the hierarchical differences dramatically. This in turn yields new and improved complexity results, which are natural generalizations of Giles's MLMC analysis and which increase the domain of the problem parameters for which we achieve the optimal convergence, O(TOL^-2). Moreover, in MIMC, the rate of increase of required memory with respect to TOL is independent of the number of directions up to a logarithmic term, which allows far more accurate solutions to be calculated in higher dimensions than is possible when using MLMC. We motivate the setting of MIMC by first focusing on a simple full tensor index set. We then propose a systematic construction of optimal sets of indices for MIMC based on properly defined profits that in turn depend on the average cost per sample and the corresponding weak error and variance. Under standard assumptions on the convergence rates of the weak error, variance, and work per sample, the optimal index set turns out to be of total degree type. In some cases, using optimal index sets, MIMC achieves a better rate for the computational complexity than the corresponding rate when using full tensor index sets. We also show the asymptotic normality of the statistical error in the resulting MIMC estimator and justify in this way our error estimate, which allows both the required accuracy and the confidence level in our computational results to be prescribed. Finally, we include numerical experiments involving a partial differential equation posed in three spatial dimensions and with random coefficients to substantiate the analysis and illustrate the corresponding computational savings of MIMC. | en
dc.language.iso | en | en
dc.publisher | Springer Science + Business Media | en
dc.relation.url | http://link.springer.com/10.1007/s00211-015-0734-5 | en
dc.rights | The final publication is available at Springer via http://dx.doi.org/10.1007/s00211-015-0734-5 | en
dc.subject | 65C05 | en
dc.subject | 65N30 | en
dc.subject | 65N22 | en
dc.title | Multi-index Monte Carlo: when sparsity meets sampling | en
dc.type | Article | en
dc.contributor.department | Applied Mathematics and Computational Science Program | en
dc.identifier.journal | Numerische Mathematik | en
dc.eprint.version | Post-print | en
dc.contributor.institution | MATHICSE-CSQI, Ecole Polytechnique Fédérale de Lausanne, Lausanne, Switzerland | en
dc.contributor.affiliation | King Abdullah University of Science and Technology (KAUST) | en
kaust.author | Haji Ali, Abdul Lateef | en
kaust.author | Tempone, Raul | en
All Items in KAUST are protected by copyright, with all rights reserved, unless otherwise indicated.