Joint Posterior Inference for Latent Gaussian Models and extended strategies using INLA
Name:
CristianChiuchioloThesisNEW.pdf
Size:
1.229 MB
Format:
PDF
Description:
PhD Dissertation revised
Type
Dissertation
Authors
Chiuchiolo, Cristian
Advisors
Rue, Haavard
Committee members
Bolin, David
Jasra, Ajay
Wilson, Simon
Program
Statistics
Date
2022-06-06
Permanent link to this record
http://hdl.handle.net/10754/679224
Abstract
Bayesian inference is particularly challenging for hierarchical statistical models, where computational complexity becomes a significant issue. Sampling-based methods such as the popular Markov Chain Monte Carlo (MCMC) can provide accurate solutions, but they often carry a high computational burden. An attractive alternative is the Integrated Nested Laplace Approximation (INLA) approach, which is faster when applied to the broad class of Latent Gaussian Models (LGMs). The method computes fast and empirically accurate deterministic approximations of the posterior marginals of the model's unknown parameters.

In the first part of this thesis, we discuss how to extend the software's applicability to joint posterior inference by constructing a new class of joint posterior approximations that also adds marginal corrections for location and skewness. As these approximations result from combining a Gaussian Copula with the accurate Gaussian Approximations pre-computed internally by INLA, we name this class the Skew Gaussian Copula (SGC). By computing the moments and correlation structure of a mixture representation of these distributions, we obtain new fast and accurate deterministic approximations for linear combinations of a subset of the model's latent field. The same mixture approximates the full joint posterior density through Monte Carlo sampling over the hyperparameter set. We construct highly skewed examples based on Poisson and Binomial hierarchical models and verify these new approximations against INLA and MCMC. The skewness correction introduced by the Skew Gaussian Copula is more consistent with the outcomes provided by the default INLA strategies.

In the last part, we propose an extension of the parametric fit employed by the Simplified Laplace Approximation strategy in INLA when approximating posterior marginals. By default, the strategy matches the log derivatives of a third-order Taylor expansion of each Laplace Approximation marginal with those of a Skew Normal distribution. We add a fourth-order term and adapt an Extended Skew Normal distribution to produce a more accurate fit when skewness is large. Using similarly skewed data simulations with Poisson and Binomial likelihoods, we show that the posterior marginals from the new extended strategy are more accurate and more coherent with the MCMC results than those of the original version.
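As a rough illustration of the Skew Gaussian Copula idea described in the abstract, the sketch below samples from a Gaussian copula whose dependence comes from a fixed correlation matrix and whose marginals are Skew Normal, so that the location and skewness corrections enter through the marginals. The correlation matrix and the marginal parameters (xi, omega, alpha) are hypothetical stand-ins for quantities that INLA would pre-compute; this is not the thesis' implementation.

```python
# Minimal sketch (not the thesis' implementation): a Gaussian copula with
# Skew Normal marginals, in the spirit of the Skew Gaussian Copula (SGC).
# R, xi, omega and alpha are hypothetical stand-ins for the correlation
# structure and marginal corrections that INLA would pre-compute.
import numpy as np
from scipy.stats import norm, skewnorm

def sample_skew_gaussian_copula(R, xi, omega, alpha, n_samples, seed=None):
    """Draw samples whose dependence comes from a Gaussian copula with
    correlation matrix R and whose marginals are SkewNormal(xi, omega, alpha)."""
    rng = np.random.default_rng(seed)
    d = R.shape[0]
    # Latent Gaussian draws carrying the joint correlation structure
    z = rng.multivariate_normal(np.zeros(d), R, size=n_samples)
    # Map to the uniform scale: this step defines the Gaussian copula
    u = norm.cdf(z)
    # Skew Normal marginals add the location and skewness corrections
    return skewnorm.ppf(u, a=alpha, loc=xi, scale=omega)

# Toy usage with hypothetical values for a two-dimensional latent subset
R = np.array([[1.0, 0.6],
              [0.6, 1.0]])
x = sample_skew_gaussian_copula(R,
                                xi=np.array([0.0, 1.0]),
                                omega=np.array([1.0, 0.5]),
                                alpha=np.array([3.0, -2.0]),
                                n_samples=1000, seed=1)
```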
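The Simplified Laplace strategy summarized above matches log-density derivatives of a Skew Normal to a third-order Taylor expansion of each Laplace Approximation marginal. As a simpler, hedged illustration of how a target skewness pins down the Skew Normal parameters, the following method-of-moments fit matches a given mean, variance and skewness; it is not the derivative-matching algorithm used by INLA or by the extended strategy, and the input summaries are hypothetical.

```python
# Minimal sketch: fit a Skew Normal by matching mean, variance and skewness
# (method of moments). This only illustrates how skewness determines the
# Skew Normal shape; the thesis matches log-density derivatives instead.
import numpy as np

def skew_normal_from_moments(mean, var, skew):
    """Return (xi, omega, alpha) of a Skew Normal with the given first three
    moments. Valid only for |skew| below ~0.995, the maximum skewness a
    Skew Normal can attain."""
    c = (2.0 * abs(skew) / (4.0 - np.pi)) ** (2.0 / 3.0)
    delta = np.sign(skew) * np.sqrt(np.pi / 2.0 * c / (1.0 + c))
    alpha = delta / np.sqrt(1.0 - delta ** 2)        # shape parameter
    omega = np.sqrt(var / (1.0 - 2.0 * delta ** 2 / np.pi))  # scale
    xi = mean - omega * delta * np.sqrt(2.0 / np.pi)         # location
    return xi, omega, alpha

# Hypothetical marginal summaries: mean 0.3, variance 1.2, skewness 0.5
print(skew_normal_from_moments(0.3, 1.2, 0.5))
```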
Citation
Chiuchiolo, C. (2022). Joint Posterior Inference for Latent Gaussian Models and extended strategies using INLA. KAUST Research Repository. https://doi.org/10.25781/KAUST-W92MX
DOI
10.25781/KAUST-W92MX