    SGD and Hogwild! Convergence Without the Bounded Gradients Assumption

    File
    Name: 1802.03801.pdf
    Size: 1.156 MB
    Format: PDF
    Description: Preprint
    Type
    Preprint
    Authors
    Nguyen, Lam M.
    Nguyen, Phuong Ha
    Dijk, Marten van
    Richtarik, Peter
    Scheinberg, Katya
    Takáč, Martin
    KAUST Department
    Computer Science
    Computer Science Program
    Computer, Electrical and Mathematical Sciences and Engineering (CEMSE) Division
    Date
    2018-02-11
    Permanent link to this record
    http://hdl.handle.net/10754/653111
    
    Abstract
    Stochastic gradient descent (SGD) is the optimization algorithm of choice in many machine learning applications such as regularized empirical risk minimization and training deep neural networks. The classical convergence analysis of SGD is carried out under the assumption that the norm of the stochastic gradient is uniformly bounded. While this might hold for some loss functions, it is always violated for cases where the objective function is strongly convex. In (Bottou et al., 2016), a new analysis of convergence of SGD is performed under the assumption that stochastic gradients are bounded with respect to the true gradient norm. Here we show that for stochastic problems arising in machine learning such a bound always holds; and we also propose an alternative convergence analysis of SGD with diminishing learning rate regime, which results in more relaxed conditions than those in (Bottou et al., 2016). We then move on to the asynchronous parallel setting, and prove convergence of the Hogwild! algorithm in the same regime, obtaining the first convergence results for this method in the case of diminishing learning rate.
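    As a rough illustration of the two methods the abstract analyzes (a sketch written for this record, not code from the paper), the following Python snippet runs SGD with a diminishing learning rate eta_t = eta0 / (1 + lam * t) on a strongly convex least-squares problem, then a Hogwild!-style run in which worker threads update the shared iterate without locks. The objective, the data, and the constants eta0 and lam are illustrative assumptions.

        import threading

        import numpy as np

        # Illustrative setup (an assumption, not from the paper): least squares
        # f(w) = (1/n) * sum_i 0.5 * (x_i^T w - y_i)^2, strongly convex when X
        # has full column rank.
        rng = np.random.default_rng(0)
        n, d = 1000, 10
        X = rng.normal(size=(n, d))
        w_true = rng.normal(size=d)
        y = X @ w_true + 0.01 * rng.normal(size=n)

        def stochastic_grad(w, i):
            # Gradient of the i-th component function 0.5 * (x_i^T w - y_i)^2.
            return (X[i] @ w - y[i]) * X[i]

        def step_size(t, eta0=0.05, lam=0.001):
            # Diminishing learning rate eta_t = eta0 / (1 + lam * t); the
            # constants are illustrative, not a schedule from the paper.
            return eta0 / (1.0 + lam * t)

        def sgd(num_steps=50_000):
            # Plain sequential SGD with the diminishing step-size schedule.
            w = np.zeros(d)
            for t in range(num_steps):
                i = rng.integers(n)  # sample one component function uniformly
                w -= step_size(t) * stochastic_grad(w, i)
            return w

        def hogwild(num_threads=4, steps_per_thread=20_000):
            # Hogwild!-style run: threads update the shared iterate without
            # locks. CPython's GIL serializes bytecode, so this demonstrates
            # the lock-free update pattern rather than a real parallel speedup.
            w = np.zeros(d)

            def worker(seed):
                nonlocal w
                local = np.random.default_rng(seed)
                for t in range(steps_per_thread):
                    i = local.integers(n)
                    g = stochastic_grad(w, i)   # may read a stale iterate
                    w -= step_size(t) * g       # deliberately unsynchronized

            threads = [threading.Thread(target=worker, args=(s,))
                       for s in range(num_threads)]
            for th in threads:
                th.start()
            for th in threads:
                th.join()
            return w

        if __name__ == "__main__":
            print("sgd distance to w_true:    ", np.linalg.norm(sgd() - w_true))
            print("hogwild distance to w_true:", np.linalg.norm(hogwild() - w_true))

    In this toy run both variants drive the iterate toward w_true; the point of the paper's analysis is that such convergence can be proved without assuming uniformly bounded stochastic gradients.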
    Citation
    Proceedings of the 35th International Conference on Machine Learning, PMLR 80:3747-3755, 2018
    Sponsors
    The authors would like to thank the reviewers for useful suggestions which helped to improve the exposition in the paper. The authors also would like to thank Francesco Orabona for his valuable comments and suggestions. Lam M. Nguyen was partially supported by NSF Grant CCF 16-18717. Phuong Ha Nguyen and Marten van Dijk were supported in part by AFOSR MURI under award number FA9550-14-1-0351. Katya Scheinberg was partially supported by NSF Grants CCF 16-18717 and CCF 17-40796. Martin Takac was supported by the U.S. National Science Foundation under award numbers NSF:CCF:1618717, NSF:CMMI:1663256 and NSF:CCF:1740796.
    Publisher
    arXiv
    arXiv ID
    1802.03801
    Additional Links
    https://arxiv.org/pdf/1802.03801
    Collections
    Preprints; Computer Science Program; Computer, Electrical and Mathematical Sciences and Engineering (CEMSE) Division

