Show simple item record

dc.contributor.author: Qian, Xun
dc.contributor.author: Richtarik, Peter
dc.contributor.author: Zhang, Tong
dc.date.accessioned: 2020-10-06T12:11:09Z
dc.date.available: 2020-10-06T12:11:09Z
dc.date.issued: 2020-09-30
dc.identifier.uri: http://hdl.handle.net/10754/665461
dc.description.abstract: Gradient compression is a recent and increasingly popular technique for reducing the communication cost in distributed training of large-scale machine learning models. In this work we focus on developing efficient distributed methods that can work for any compressor satisfying a certain contraction property, which includes both unbiased (after appropriate scaling) and biased compressors such as RandK and TopK. Applied naively, gradient compression introduces errors that either slow down convergence or lead to divergence. A popular technique designed to tackle this issue is error compensation/error feedback. Due to the difficulties associated with analyzing biased compressors, it is not known whether gradient compression with error compensation can be combined with Nesterov's acceleration. In this work, we show for the first time that error compensated gradient compression methods can be accelerated. In particular, we propose and study the error compensated loopless Katyusha method, and establish an accelerated linear convergence rate under standard assumptions. We show through numerical experiments that the proposed method converges with substantially fewer communication rounds than previous error compensated algorithms.
dc.publisher: arXiv
dc.relation.url: https://arxiv.org/pdf/2010.00091
dc.rights: Archived with thanks to arXiv
dc.title: Error Compensated Distributed SGD Can Be Accelerated
dc.type: Preprint
dc.contributor.department: Visual Computing Center (VCC)
dc.contributor.department: Computer, Electrical and Mathematical Sciences and Engineering (CEMSE) Division
dc.contributor.department: Computer Science Program
dc.eprint.version: Pre-print
dc.contributor.institution: Hong Kong University of Science and Technology, Hong Kong
dc.identifier.arxivid: 2010.00091
kaust.person: Qian, Xun
kaust.person: Richtarik, Peter
refterms.dateFOA: 2020-10-06T12:11:37Z
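The abstract describes error compensation (error feedback) applied to a biased compressor such as TopK: the residual lost to compression is stored locally and added back to the next gradient before compressing again. The sketch below is a minimal single-worker illustration of that generic mechanism in NumPy, not the paper's accelerated ECLK method; the function names and the plain-SGD update are illustrative assumptions.

```python
import numpy as np

def topk(v, k):
    """TopK compressor: keep the k largest-magnitude entries, zero the rest (biased)."""
    out = np.zeros_like(v)
    idx = np.argpartition(np.abs(v), -k)[-k:]
    out[idx] = v[idx]
    return out

def ec_sgd_step(x, grad, e, lr, k):
    """One error-compensated compressed SGD step (illustrative, single worker).

    Compress the error-corrected update, then carry the compression
    residual forward so no gradient information is permanently lost.
    """
    v = lr * grad + e      # add the accumulated compression error back in
    g = topk(v, k)         # transmit only k coordinates
    e_new = v - g          # error feedback: store what the compressor dropped
    return x - g, e_new
```

Without the `e` term, TopK can stall or diverge because the same coordinates may be discarded forever; with error feedback the residuals accumulate until they become large enough to be transmitted, recovering convergence.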


Files in this item

Name: Preprintfile1.pdf
Size: 831.8 KB
Format: PDF
Description: Pre-print

This item appears in the following Collection(s)
