Show simple item record

dc.contributor.author: Khaled, Ahmed
dc.contributor.author: Sebbouh, Othmane
dc.contributor.author: Loizou, Nicolas
dc.contributor.author: Gower, Robert M.
dc.contributor.author: Richtarik, Peter
dc.date.accessioned: 2020-06-29T07:42:35Z
dc.date.available: 2020-06-29T07:42:35Z
dc.date.issued: 2020-06-20
dc.identifier.uri: http://hdl.handle.net/10754/663909
dc.description.abstract: We present a unified theorem for the convergence analysis of stochastic gradient algorithms for minimizing a smooth and convex loss plus a convex regularizer. We do this by extending the unified analysis of Gorbunov, Hanzely & Richtárik (2020) and dropping the requirement that the loss function be strongly convex. Instead, we only rely on convexity of the loss function. Our unified analysis applies to a host of existing algorithms such as proximal SGD, variance-reduced methods, quantization, and some coordinate descent-type methods. For the variance-reduced methods, we recover the best known convergence rates as special cases. For proximal SGD and the quantization and coordinate-type methods, we uncover new state-of-the-art convergence rates. Our analysis also covers any form of sampling and minibatching. As such, we are able to determine the minibatch size that optimizes the total complexity of variance-reduced methods. We showcase this by obtaining a simple formula for the optimal minibatch size of two variance-reduced methods (L-SVRG and SAGA). This optimal minibatch size not only improves the theoretical total complexity of the methods but also improves their convergence in practice, as we show in several experiments.
dc.publisher: arXiv
dc.relation.url: https://arxiv.org/pdf/2006.11573
dc.rights: Archived with thanks to arXiv
dc.title: Unified Analysis of Stochastic Gradient Methods for Composite Convex and Smooth Optimization
dc.type: Preprint
dc.contributor.department: Computer Science Program
dc.contributor.department: Computer, Electrical and Mathematical Sciences and Engineering (CEMSE) Division
dc.eprint.version: Pre-print
dc.contributor.institution: Cairo University, Giza, Egypt.
dc.contributor.institution: Mila, Université de Montréal, Montréal, Canada.
dc.contributor.institution: Facebook AI Research, New York, USA.
dc.identifier.arxivid: 2006.11573
kaust.person: Khaled, Ahmed
kaust.person: Sebbouh, Othmane
kaust.person: Richtarik, Peter
refterms.dateFOA: 2020-06-29T07:43:03Z
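The abstract covers proximal SGD for composite problems of the form "smooth convex loss plus convex regularizer". As a minimal illustrative sketch (not the paper's code, and with hypothetical helper names), here is proximal SGD applied to ℓ1-regularized least squares, where the proximal operator of the ℓ1 norm is soft-thresholding:

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def prox_sgd(grad_i, prox, x0, n, gamma, iters, rng):
    # Proximal SGD: sample one loss index i, take a stochastic
    # gradient step on the smooth part, then apply the prox of
    # the regularizer with step size gamma.
    x = x0.copy()
    for _ in range(iters):
        i = rng.integers(n)
        x = prox(x - gamma * grad_i(x, i), gamma)
    return x

# Hypothetical toy problem: min_x (1/2n) ||Ax - b||^2 + lam * ||x||_1
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 10))
b = A @ rng.standard_normal(10)
lam = 0.01

grad_i = lambda x, i: A[i] * (A[i] @ x - b[i])       # gradient of one squared loss
prox = lambda v, g: soft_threshold(v, g * lam)       # prox of the regularizer
obj = lambda x: 0.5 * np.mean((A @ x - b) ** 2) + lam * np.abs(x).sum()

x0 = np.zeros(10)
x = prox_sgd(grad_i, prox, x0, n=50, gamma=0.01, iters=5000, rng=rng)
```

With a small constant step size the composite objective `obj(x)` decreases well below its value at `x0`; the variance-reduced variants the paper analyzes (L-SVRG, SAGA) replace the plain stochastic gradient with a variance-corrected estimate.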


Files in this item

Name: Preprintfile1.pdf
Size: 719.0Kb
Format: PDF
Description: Pre-print

This item appears in the following Collection(s)
