Asymptotic optimality and efficient computation of the leave-subject-out cross-validation
Type
Article
Authors
Xu, Ganggang; Huang, Jianhua Z.
KAUST Grant Number
KUS-CI-016-04Date
2012-12Permanent link to this record
http://hdl.handle.net/10754/597623
Abstract
Although leave-subject-out cross-validation (CV) has been widely used in practice for tuning parameter selection in various nonparametric and semiparametric models of longitudinal data, its theoretical properties are unknown and solving the associated optimization problem is computationally expensive, especially when there are multiple tuning parameters. In this paper, focusing on the penalized spline method, we show that leave-subject-out CV is optimal in the sense that it is asymptotically equivalent to minimizing the empirical squared error loss. An efficient Newton-type algorithm is developed to compute the penalty parameters that optimize the CV criterion. Simulated and real data are used to demonstrate the effectiveness of leave-subject-out CV in selecting both the penalty parameters and the working correlation matrix. © 2012 Institute of Mathematical Statistics.
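To make the criterion concrete, below is a minimal illustrative sketch of leave-subject-out CV for a penalized spline fit. It is not the authors' algorithm: the paper develops an efficient Newton-type method for optimizing the CV criterion, whereas this sketch evaluates the criterion by brute-force refitting over a grid of penalty values. The helper names (`spline_basis`, `fit_penalized`, `loso_cv`) and the truncated power basis are assumptions made for the example.

```python
import numpy as np

def spline_basis(x, knots, degree=3):
    """Truncated power basis: a simple stand-in for the B-spline
    basis used in penalized spline regression."""
    cols = [x**d for d in range(degree + 1)]
    cols += [np.maximum(x - k, 0.0)**degree for k in knots]
    return np.column_stack(cols)

def fit_penalized(X, y, lam, n_poly):
    """Ridge-type penalized least squares; only the knot (truncated
    power) coefficients are penalized, not the polynomial part."""
    p = X.shape[1]
    D = np.eye(p)
    D[:n_poly, :n_poly] = 0.0  # leave polynomial terms unpenalized
    return np.linalg.solve(X.T @ X + lam * D, X.T @ y)

def loso_cv(x, y, subject, lam, knots, degree=3):
    """Leave-subject-out CV score: hold out ALL observations of one
    subject at a time, refit, and average the squared prediction error."""
    X = spline_basis(x, knots, degree)
    n_poly = degree + 1
    err = 0.0
    for s in np.unique(subject):
        train, test = subject != s, subject == s
        beta = fit_penalized(X[train], y[train], lam, n_poly)
        err += np.sum((y[test] - X[test] @ beta) ** 2)
    return err / len(y)

# Toy usage: pick the penalty that minimizes the CV criterion.
rng = np.random.default_rng(0)
subject = np.repeat(np.arange(20), 5)          # 20 subjects, 5 obs each
x = rng.uniform(0.0, 1.0, size=subject.size)
y = np.sin(2 * np.pi * x) + rng.normal(0.0, 0.3, size=x.size)
knots = np.linspace(0.1, 0.9, 9)
lams = 10.0 ** np.arange(-4, 3)
best_lam = min(lams, key=lambda lam: loso_cv(x, y, subject, lam, knots))
```

Holding out each subject's entire block of repeated measurements, rather than individual observations, is what distinguishes this criterion from ordinary leave-one-out CV and makes it appropriate for correlated longitudinal data.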
Citation
Xu G, Huang JZ (2012) Asymptotic optimality and efficient computation of the leave-subject-out cross-validation. The Annals of Statistics 40: 3003–3030. Available: http://dx.doi.org/10.1214/12-AOS1063.
Sponsors
Supported in part by Award Number KUS-CI-016-04, made by King Abdullah University of Science and Technology (KAUST). Supported in part by NSF Grants DMS-09-07170, DMS-10-07618, DMS-12-08952 and NCI Grant CA57030.
Publisher
Institute of Mathematical Statistics
Journal
The Annals of Statistics
DOI
10.1214/12-AOS1063