Show simple item record

dc.contributor.author: Deng, Zeyu
dc.contributor.author: Kammoun, Abla
dc.contributor.author: Thrampoulidis, Christos
dc.date.accessioned: 2020-05-06T08:41:28Z
dc.date.available: 2020-05-06T08:41:28Z
dc.date.issued: 2020-04-09
dc.identifier.citation: Deng, Z., Kammoun, A., & Thrampoulidis, C. (2020). A Model of Double Descent for High-Dimensional Logistic Regression. ICASSP 2020 - 2020 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP). doi:10.1109/icassp40776.2020.9053524
dc.identifier.isbn: 978-1-5090-6632-2
dc.identifier.issn: 1520-6149
dc.identifier.doi: 10.1109/ICASSP40776.2020.9053524
dc.identifier.uri: http://hdl.handle.net/10754/662742
dc.description.abstract: We consider a model for logistic regression where only a subset of features of size p is used for training a linear classifier over n training samples. The classifier is obtained by running gradient descent (GD) on the logistic loss. For this model, we investigate the dependence of the classification error on the overparameterization ratio κ = p/n. First, building on known deterministic results on the convergence properties of GD, we uncover a phase-transition phenomenon for the case of Gaussian features: the classification error of GD is the same as that of the maximum-likelihood (ML) solution when κ < κ⋆, and the same as that of the max-margin (SVM) solution when κ > κ⋆. Next, using the convex Gaussian min-max theorem (CGMT), we sharply characterize the performance of both the ML and SVM solutions. Combining these results, we obtain curves that explicitly characterize the test error of GD for varying values of κ. The numerical results validate the theoretical predictions and unveil "double-descent" phenomena that complement similar recent observations in linear regression settings.
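The experiment the abstract describes is simple to reproduce numerically. Below is a minimal illustrative sketch (not the authors' code): Gaussian features are drawn, labels are generated from a logistic model, a linear classifier is trained by plain gradient descent on the logistic loss using only the first p of the features, and the test error is reported as the overparameterization ratio κ = p/n varies. All problem sizes, the signal strength, the step size, and the iteration count below are assumed for illustration only.

# Minimal sketch of the paper's setting (illustrative parameters, not the authors' code).
import numpy as np

rng = np.random.default_rng(0)

d, n, n_test = 400, 100, 10_000          # total features, train size, test size (assumed)
beta = rng.standard_normal(d)
beta *= 3.0 / np.linalg.norm(beta)       # signal strength (assumed)

def sample(m):
    """Gaussian features with labels drawn from a logistic model."""
    X = rng.standard_normal((m, d))
    y = np.where(rng.random(m) < 1.0 / (1.0 + np.exp(-X @ beta)), 1.0, -1.0)
    return X, y

X_tr, y_tr = sample(n)
X_te, y_te = sample(n_test)

def gd_logistic(X, y, steps=20_000, lr=0.1):
    """Full-batch gradient descent on the unregularized logistic loss."""
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        m = np.clip(y * (X @ w), -50.0, 50.0)            # clip margins to avoid exp overflow
        grad = -(X.T @ (y / (1.0 + np.exp(m)))) / len(y)  # gradient of mean log(1 + exp(-y x'w))
        w -= lr * grad
    return w

# Train on only the first p features and trace the test error versus kappa = p/n.
for p in (25, 50, 75, 100, 150, 200, 300, 400):
    w = gd_logistic(X_tr[:, :p], y_tr)
    err = np.mean(np.sign(X_te[:, :p] @ w) != y_te)
    print(f"kappa = {p / n:4.2f}  test error = {err:.3f}")

In the separable (large-κ) regime the GD iterates grow in norm but converge in direction, so the classification decision above matches that of the max-margin solution; in the small-κ regime the iterates converge to the maximum-likelihood solution, consistent with the phase transition described in the abstract.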
dc.publisher: Institute of Electrical and Electronics Engineers (IEEE)
dc.relation.url: https://ieeexplore.ieee.org/document/9053524/
dc.relation.url: https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=9053524
dc.rights: Archived with thanks to IEEE
dc.subject: Generalization error
dc.subject: Binary Classification
dc.subject: Overparameterization
dc.subject: Max-margin
dc.subject: Asymptotics
dc.title: A Model of Double Descent for High-Dimensional Logistic Regression
dc.type: Conference Paper
dc.contributor.department: Computer, Electrical and Mathematical Sciences and Engineering (CEMSE) Division
dc.conference.date: 4-8 May 2020
dc.conference.name: ICASSP 2020 - 2020 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
dc.conference.location: Barcelona, Spain
dc.eprint.version: Post-print
dc.contributor.institution: University of California, Santa Barbara
kaust.person: Kammoun, Abla
dc.date.published-online: 2020-04-09
dc.date.published-print: 2020-05

