Show simple item record

dc.contributor.author: Hanzely, Filip
dc.contributor.author: Richtarik, Peter
dc.contributor.author: Xiao, Lin
dc.date.accessioned: 2021-04-21T07:56:17Z
dc.date.available: 2019-05-29T07:28:01Z
dc.date.available: 2021-04-21T07:56:17Z
dc.date.issued: 2021-04-07
dc.date.submitted: 2020-04-24
dc.identifier.citation: Hanzely, F., Richtárik, P., & Xiao, L. (2021). Accelerated Bregman proximal gradient methods for relatively smooth convex optimization. Computational Optimization and Applications. doi:10.1007/s10589-021-00273-8
dc.identifier.issn: 1573-2894
dc.identifier.issn: 0926-6003
dc.identifier.doi: 10.1007/s10589-021-00273-8
dc.identifier.uri: http://hdl.handle.net/10754/653113
dc.description.abstract: We consider the problem of minimizing the sum of two convex functions: one is differentiable and relatively smooth with respect to a reference convex function, and the other can be nondifferentiable but simple to optimize. We investigate a triangle scaling property of the Bregman distance generated by the reference convex function and present accelerated Bregman proximal gradient (ABPG) methods that attain an O(k^{-γ}) convergence rate, where γ ∈ (0, 2] is the triangle scaling exponent (TSE) of the Bregman distance. For the Euclidean distance, we have γ = 2 and recover the convergence rate of Nesterov’s accelerated gradient methods. For non-Euclidean Bregman distances, the TSE can be much smaller (say γ ≤ 1), but we show that a relaxed definition of intrinsic TSE is always equal to 2. We exploit the intrinsic TSE to develop adaptive ABPG methods that converge much faster in practice. Although theoretical guarantees on a fast convergence rate seem to be out of reach in general, our methods obtain empirical O(k^{-2}) rates in numerical experiments on several applications and provide posterior numerical certificates for the fast rates.
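The abstract's key objects can be made concrete with the following short sketch, stated from standard background definitions of the Bregman distance, relative smoothness, and the triangle scaling property rather than quoted from this record. For a reference convex function h, the Bregman distance is

    D_h(x, y) = h(x) − h(y) − ⟨∇h(y), x − y⟩,

and f is L-smooth relative to h if f(x) ≤ f(y) + ⟨∇f(y), x − y⟩ + L · D_h(x, y) for all x, y. The Bregman distance has triangle scaling exponent γ if, for all x, z, z' and θ ∈ [0, 1],

    D_h((1 − θ)x + θz, (1 − θ)x + θz') ≤ θ^γ · D_h(z, z').

With the Euclidean reference h(x) = ½‖x‖², one has D_h(x, y) = ½‖x − y‖², the inequality holds with γ = 2, and the O(k^{-2}) rate of Nesterov’s accelerated gradient methods is recovered, consistent with the abstract.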
dc.description.sponsorship: We thank Haihao Lu, Robert Freund and Yurii Nesterov for helpful conversations. We are also grateful to the anonymous referees whose comments helped improve the clarity of the paper. Peter Richtárik acknowledges the support of the KAUST Baseline Research Funding Scheme.
dc.publisher: Springer Science and Business Media LLC
dc.relation.url: http://link.springer.com/10.1007/s10589-021-00273-8
dc.rights: Archived with thanks to Computational Optimization and Applications
dc.title: Accelerated Bregman proximal gradient methods for relatively smooth convex optimization
dc.type: Article
dc.contributor.department: Applied Mathematics and Computational Science Program
dc.contributor.department: Computer, Electrical and Mathematical Sciences and Engineering (CEMSE) Division
dc.contributor.department: Computer Science Program
dc.identifier.journal: Computational Optimization and Applications
dc.rights.embargodate: 2022-04-07
dc.eprint.version: Post-print
dc.contributor.institution: Toyota Technological Institute at Chicago (TTIC), Chicago, Illinois, USA
dc.contributor.institution: Moscow Institute of Physics and Technology, Dolgoprudny, Russia
dc.contributor.institution: Microsoft Research, Redmond, WA, USA
dc.identifier.arxivid: 1808.03045
kaust.person: Hanzely, Filip
kaust.person: Richtarik, Peter
dc.version: v1
dc.identifier.eid: 2-s2.0-85103928357
refterms.dateFOA: 2019-05-29T07:28:12Z
kaust.acknowledged.supportUnit: KAUST Baseline Research Funding Scheme


Files in this item

Name: 1808.03045.pdf
Size: 1.263 MB
Format: PDF
Description: Preprint
Embargo End Date: 2022-04-07

