Sparse Reduced-Rank Regression for Simultaneous Dimension Reduction and Variable Selection

Handle URI:
http://hdl.handle.net/10754/599681
Title:
Sparse Reduced-Rank Regression for Simultaneous Dimension Reduction and Variable Selection
Authors:
Chen, Lisha; Huang, Jianhua Z.
Abstract:
Reduced-rank regression is an effective method for predicting multiple response variables from the same set of predictor variables. It reduces the number of model parameters and exploits interrelations among the response variables, thereby improving predictive accuracy. We propose to select relevant variables for reduced-rank regression by using a sparsity-inducing penalty. We apply a group-lasso-type penalty that treats each row of the matrix of regression coefficients as a group, and we show that this penalty satisfies certain desirable invariance properties. We develop two numerical algorithms to solve the penalized regression problem and establish the asymptotic consistency of the proposed method. In particular, the manifold structure of the reduced-rank regression coefficient matrix is accounted for in our theoretical analysis. In our simulation study and real data analysis, the new method is compared with several existing variable selection methods for multivariate regression and exhibits competitive performance in prediction and variable selection. © 2012 American Statistical Association.
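The idea of combining row-wise sparsity with rank reduction can be illustrated with a small sketch. The snippet below is not the authors' algorithm (which involves optimization over a Stiefel manifold): it is an illustrative heuristic that alternates a proximal-gradient step for the row-wise group-lasso penalty (group soft-thresholding) with a truncated-SVD projection onto the target rank. All function names and parameter choices are hypothetical.

```python
import numpy as np

def sparse_rrr(X, Y, rank, lam, n_iter=200, step=None):
    """Heuristic sketch of sparse reduced-rank regression.

    Minimizes (approximately) ||Y - XC||_F^2 / 2 + lam * sum_i ||c_i||_2
    subject to rank(C) <= rank, by alternating group soft-thresholding
    of the rows of C with a truncated-SVD rank projection.
    Illustrative only -- not the algorithm proposed in the paper.
    """
    n, p = X.shape
    q = Y.shape[1]
    C = np.zeros((p, q))
    if step is None:
        # step size from the Lipschitz constant of the smooth part
        step = 1.0 / np.linalg.norm(X, 2) ** 2
    for _ in range(n_iter):
        # gradient step on the least-squares loss
        G = X.T @ (X @ C - Y)
        Z = C - step * G
        # group soft-thresholding: shrink each row of Z toward zero;
        # rows with small norm are set exactly to zero (variable selection)
        norms = np.linalg.norm(Z, axis=1, keepdims=True)
        scale = np.maximum(1.0 - step * lam / np.maximum(norms, 1e-12), 0.0)
        C = scale * Z
        # project onto rank-constrained matrices via truncated SVD
        U, s, Vt = np.linalg.svd(C, full_matrices=False)
        C = (U[:, :rank] * s[:rank]) @ Vt[:rank]
    return C
```

Because the penalty acts on whole rows of the coefficient matrix, a zeroed row removes the corresponding predictor from all response equations at once, which is the sense in which dimension reduction and variable selection happen simultaneously.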
Citation:
Chen L, Huang JZ (2012) Sparse Reduced-Rank Regression for Simultaneous Dimension Reduction and Variable Selection. Journal of the American Statistical Association 107: 1533–1545. Available: http://dx.doi.org/10.1080/01621459.2012.734178.
Publisher:
Informa UK Limited
Journal:
Journal of the American Statistical Association
KAUST Grant Number:
KUS-CI-016-04
Issue Date:
Dec-2012
DOI:
10.1080/01621459.2012.734178
Type:
Article
ISSN:
0162-1459; 1537-274X
Sponsors:
Lisha Chen is Assistant Professor, Department of Statistics, Yale University, New Haven, CT 06511 (E-mail: lisha.chen@yale.edu). Jianhua Z. Huang is Professor, Department of Statistics, Texas A&M University, College Station, TX 77843-3143 (E-mail: jianhua@stat.tamu.edu). Huang's work was partially supported by grants from the National Science Foundation (NSF; DMS-0907170, DMS-1007618, DMS-1208952) and Award Number KUS-CI-016-04, made by King Abdullah University of Science and Technology (KAUST). The authors thank Joseph Chang, Dean Foster, Zhihua Qiao, Lyle Ungar, and Lan Zhou for helpful discussions. They also thank two anonymous reviewers, an associate editor and the co-editor Jun Liu for constructive comments.
Appears in Collections:
Publications Acknowledging KAUST Support

Full metadata record

DC Field: Value
dc.contributor.author: Chen, Lisha
dc.contributor.author: Huang, Jianhua Z.
dc.date.accessioned: 2016-02-28T06:07:24Z
dc.date.available: 2016-02-28T06:07:24Z
dc.date.issued: 2012-12
dc.identifier.issn: 0162-1459
dc.identifier.issn: 1537-274X
dc.identifier.doi: 10.1080/01621459.2012.734178
dc.identifier.uri: http://hdl.handle.net/10754/599681
dc.publisher: Informa UK Limited
dc.subject: Group lasso penalty
dc.subject: Low rank matrix approximation
dc.subject: Multivariate regression
dc.subject: Penalized least squares
dc.subject: Sparsity
dc.subject: Stiefel manifold
dc.title: Sparse Reduced-Rank Regression for Simultaneous Dimension Reduction and Variable Selection
dc.type: Article
dc.identifier.journal: Journal of the American Statistical Association
dc.contributor.institution: Yale University, New Haven, United States
dc.contributor.institution: Texas A&M University, College Station, United States
kaust.grant.number: KUS-CI-016-04
All Items in KAUST are protected by copyright, with all rights reserved, unless otherwise indicated.