A Model of Double Descent for High-Dimensional Logistic Regression
Type: Conference Paper
Date: 2020-04-09
Online Publication Date: 2020-04-09
Print Publication Date: 2020-05
Permanent link to this record: http://hdl.handle.net/10754/662742
Abstract
We consider a model for logistic regression where only a subset of features of size p is used for training a linear classifier over n training samples. The classifier is obtained by running gradient descent (GD) on the logistic loss. For this model, we investigate the dependence of the classification error on the overparameterization ratio κ = p/n. First, building on known deterministic results on the convergence properties of GD, we uncover a phase-transition phenomenon for the case of Gaussian features: the classification error of GD is the same as that of the maximum-likelihood (ML) solution when κ < κ⋆, and the same as that of the max-margin (SVM) solution when κ > κ⋆. Next, using the convex Gaussian min-max theorem (CGMT), we sharply characterize the performance of both the ML and SVM solutions. Combining these results, we obtain curves that explicitly characterize the test error of GD for varying values of κ. The numerical results validate the theoretical predictions and unveil "double-descent" phenomena that complement similar recent observations in linear regression settings.
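The setup described in the abstract is straightforward to simulate. The sketch below is a hypothetical illustration, not the authors' code: the problem sizes (n = 200, d = 1000), the signal strength, the step size, and the iteration count are all assumptions. It trains a linear classifier by plain gradient descent on the (unregularized) logistic loss using only the first p of d Gaussian features, then reports the test classification error for several values of κ = p/n.

```python
# Hypothetical sketch of the abstract's setup (not the authors' code):
# gradient descent on the logistic loss, using only the first p of d
# Gaussian features, with test error recorded as kappa = p / n varies.
import numpy as np

rng = np.random.default_rng(0)
n, n_test, d = 200, 2000, 1000          # assumed sizes, not from the paper
beta = rng.standard_normal(d)
beta /= np.linalg.norm(beta)            # ground-truth direction

def sample(m):
    X = rng.standard_normal((m, d))     # i.i.d. Gaussian features
    p_pos = 1.0 / (1.0 + np.exp(-5.0 * (X @ beta)))  # logistic label model
    y = np.where(rng.random(m) < p_pos, 1.0, -1.0)
    return X, y

X_train, y_train = sample(n)
X_test, y_test = sample(n_test)

def gd_logistic(X, y, steps=5000, lr=0.1):
    """Plain gradient descent on the empirical logistic loss."""
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        margins = np.clip(y * (X @ w), -50.0, 50.0)  # clip to avoid overflow
        grad = -(X.T @ (y / (1.0 + np.exp(margins)))) / len(y)
        w -= lr * grad
    return w

for p in (50, 100, 200, 400, 800):      # sweep kappa = p / n across 1
    w = gd_logistic(X_train[:, :p], y_train)
    err = np.mean(np.sign(X_test[:, :p] @ w) != y_test)
    print(f"kappa = {p / n:4.2f}  test error = {err:.3f}")
```

Note that the classification error depends only on the direction of w. Beyond the separability threshold (κ > κ⋆ in the abstract's notation), the training data become linearly separable and the GD iterates grow in norm while converging in direction, which is why many iterations are needed before the test error of GD matches that of the max-margin (SVM) solution.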
Citation
Deng, Z., Kammoun, A., & Thrampoulidis, C. (2020). A Model of Double Descent for High-Dimensional Logistic Regression. ICASSP 2020 - 2020 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP). doi:10.1109/icassp40776.2020.9053524

Conference/Event name: ICASSP 2020 - 2020 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)

ISBN: 978-1-5090-6632-2

Additional Links:
https://ieeexplore.ieee.org/document/9053524/
https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=9053524

DOI: 10.1109/ICASSP40776.2020.9053524