HLIBCov: Parallel Hierarchical Matrix Approximation of Large Covariance Matrices and Likelihoods with Applications in Parameter Identification

Handle URI:
http://hdl.handle.net/10754/625507
Title:
HLIBCov: Parallel Hierarchical Matrix Approximation of Large Covariance Matrices and Likelihoods with Applications in Parameter Identification
Authors:
Litvinenko, Alexander (ORCID: 0000-0001-5427-3598)
Abstract:
The main goal of this article is to introduce the parallel hierarchical matrix library HLIBpro to the statistical community. We describe the HLIBCov package, an extension of the HLIBpro library for approximating large covariance matrices and maximizing likelihood functions. We show that an approximate Cholesky factorization of a dense matrix of size $2M\times 2M$ can be computed on a modern multi-core desktop in a few minutes. Further, HLIBCov is used to estimate unknown parameters, such as the covariance length, variance, and smoothness of a Mat\'ern covariance function, by maximizing the joint Gaussian log-likelihood function. The computational bottleneck is the expensive linear algebra arising from the large, dense covariance matrices. Therefore, the covariance matrices are approximated in the hierarchical ($\mathcal{H}$-) matrix format, with computational cost $\mathcal{O}(k^2n \log^2 n/p)$ and storage $\mathcal{O}(kn \log n)$, where the rank $k$ is a small integer (typically $k<25$), $p$ is the number of cores, and $n$ is the number of locations on a fairly general mesh. We demonstrate a synthetic example in which the true parameter values are known. For reproducibility, we provide the C++ code, the documentation, and the synthetic data.
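To make the objective concrete: the abstract describes maximizing a joint Gaussian log-likelihood whose dominant cost is a Cholesky factorization of the covariance matrix. The sketch below evaluates that log-likelihood with a dense Cholesky in Python, using the Matérn kernel with smoothness $\nu = 1/2$ (the exponential kernel) to keep the example self-contained; it is an illustration of the quantity being maximized, not the HLIBCov implementation, which replaces the dense factorization with an $\mathcal{H}$-matrix Cholesky of cost $\mathcal{O}(k^2 n \log^2 n / p)$. All function names here are hypothetical.

```python
import numpy as np
from scipy.linalg import cho_factor, cho_solve

def exp_cov(X, length, variance):
    """Matérn covariance with smoothness nu = 1/2, i.e. the exponential
    kernel variance * exp(-d / length). The general Matérn case would
    additionally need scipy.special.kv (modified Bessel function)."""
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    return variance * np.exp(-d / length)

def gaussian_log_likelihood(theta, X, z):
    """Joint Gaussian log-likelihood
        log L = -1/2 * (n log(2*pi) + log det C + z^T C^{-1} z)
    evaluated via a dense Cholesky factorization C = L L^T."""
    length, variance = theta
    C = exp_cov(X, length, variance)
    factor = cho_factor(C, lower=True)       # O(n^3) dense; H-matrix version is ~ O(k^2 n log^2 n)
    logdet = 2.0 * np.sum(np.log(np.diag(factor[0])))
    quad = z @ cho_solve(factor, z)          # z^T C^{-1} z without forming C^{-1}
    n = len(z)
    return -0.5 * (n * np.log(2.0 * np.pi) + logdet + quad)
```

Parameter estimation as described in the abstract then amounts to maximizing `gaussian_log_likelihood` over `theta` (e.g. with `scipy.optimize.minimize` on the negated function); the $\mathcal{H}$-matrix format is what makes each such evaluation feasible for millions of locations.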
KAUST Department:
Bayesian Computational Statistics & Modeling, CEMSE
Issue Date:
26-Sep-2017
Type:
Technical Report
Sponsors:
KAUST
Appears in Collections:
Technical Reports

Full metadata record

DC Field: Value (Language)
dc.contributor.author: Litvinenko, Alexander (en)
dc.date.accessioned: 2017-09-26T05:48:20Z
dc.date.available: 2017-09-26T05:48:20Z
dc.date.issued: 2017-09-26
dc.identifier.uri: http://hdl.handle.net/10754/625507
dc.description.abstract: The main goal of this article is to introduce the parallel hierarchical matrix library HLIBpro to the statistical community. We describe the HLIBCov package, an extension of the HLIBpro library for approximating large covariance matrices and maximizing likelihood functions. We show that an approximate Cholesky factorization of a dense matrix of size $2M\times 2M$ can be computed on a modern multi-core desktop in a few minutes. Further, HLIBCov is used to estimate unknown parameters, such as the covariance length, variance, and smoothness of a Mat\'ern covariance function, by maximizing the joint Gaussian log-likelihood function. The computational bottleneck is the expensive linear algebra arising from the large, dense covariance matrices. Therefore, the covariance matrices are approximated in the hierarchical ($\mathcal{H}$-) matrix format, with computational cost $\mathcal{O}(k^2n \log^2 n/p)$ and storage $\mathcal{O}(kn \log n)$, where the rank $k$ is a small integer (typically $k<25$), $p$ is the number of cores, and $n$ is the number of locations on a fairly general mesh. We demonstrate a synthetic example in which the true parameter values are known. For reproducibility, we provide the C++ code, the documentation, and the synthetic data. (en)
dc.description.sponsorship: KAUST (en)
dc.subject: parallel hierarchical matrices (en)
dc.subject: Matérn covariance (en)
dc.subject: large data sets (en)
dc.subject: low-rank approximation (en)
dc.subject: parameter identification (en)
dc.subject: spatial statistics (en)
dc.title: HLIBCov: Parallel Hierarchical Matrix Approximation of Large Covariance Matrices and Likelihoods with Applications in Parameter Identification (en)
dc.type: Technical Report (en)
dc.contributor.department: Bayesian Computational Statistics & Modeling, CEMSE (en)
All Items in KAUST are protected by copyright, with all rights reserved, unless otherwise indicated.