Show simple item record

dc.contributor.author: Qin, Yipeng
dc.contributor.author: Mitra, Niloy
dc.contributor.author: Wonka, Peter
dc.date.accessioned: 2020-11-04T13:14:26Z
dc.date.available: 2020-11-04T13:14:26Z
dc.date.issued: 2020-10-09
dc.identifier.citation: Qin, Y., Mitra, N., & Wonka, P. (2020). How Does Lipschitz Regularization Influence GAN Training? Lecture Notes in Computer Science, 310–326. doi:10.1007/978-3-030-58517-4_19
dc.identifier.isbn: 9783030585167
dc.identifier.issn: 1611-3349
dc.identifier.issn: 0302-9743
dc.identifier.doi: 10.1007/978-3-030-58517-4_19
dc.identifier.uri: http://hdl.handle.net/10754/665819
dc.description.abstract: Despite the success of Lipschitz regularization in stabilizing GAN training, the exact reason for its effectiveness remains poorly understood. The direct effect of K-Lipschitz regularization is to restrict the L2-norm of the neural network gradient to be smaller than a threshold K (e.g., K = 1) such that ||∇f|| ≤ K. In this work, we uncover an even more important effect of Lipschitz regularization by examining its impact on the loss function: it degenerates GAN loss functions to almost linear ones by restricting their domain and interval of attainable gradient values. Our analysis shows that loss functions are only successful if they are degenerated to almost linear ones. We also show that loss functions perform poorly if they are not degenerated, and that a wide range of functions can be used as loss functions as long as they are sufficiently degenerated by regularization. In essence, Lipschitz regularization ensures that all loss functions effectively work in the same way. Empirically, we verify our proposition on the MNIST, CIFAR10, and CelebA datasets.
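The degeneration effect described in the abstract can be illustrated with a small numerical sketch (not the paper's code; function names and the chosen intervals are illustrative assumptions): when Lipschitz regularization confines the discriminator output to a narrow interval, a smooth GAN loss is evaluated only on a small domain where it is almost linear.

```python
import numpy as np

def gan_loss(x):
    # Standard non-saturating GAN loss term: -log(sigmoid(x))
    return -np.log(1.0 / (1.0 + np.exp(-x)))

def max_deviation_from_linear(f, lo, hi, n=1001):
    """Max gap between f and the straight chord through its endpoints on [lo, hi]."""
    x = np.linspace(lo, hi, n)
    y = f(x)
    chord = y[0] + (y[-1] - y[0]) * (x - lo) / (hi - lo)
    return np.max(np.abs(y - chord))

# Wide domain (weak or no regularization): the loss is clearly nonlinear.
wide = max_deviation_from_linear(gan_loss, -5.0, 5.0)

# Narrow domain (a hypothetical interval reachable under strong
# Lipschitz regularization): the loss is nearly linear.
narrow = max_deviation_from_linear(gan_loss, -0.05, 0.05)

print(wide, narrow)
```

On the wide interval the deviation from linearity is on the order of 1, while on the narrow interval it drops by several orders of magnitude, consistent with the claim that sufficiently strong regularization makes different loss functions behave alike.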
dc.description.sponsorship: This work was supported in part by the KAUST Office of Sponsored Research (OSR) under Award No. OSR-CRG2018-3730.
dc.publisher: Springer Nature
dc.relation.url: http://link.springer.com/10.1007/978-3-030-58517-4_19
dc.relation.url: http://orca.cf.ac.uk/133740/1/how_does_lipschitz_regularization_influence_gan_training.pdf
dc.rights: Archived with thanks to Springer International Publishing
dc.rights: This file is an open access version redistributed from: http://orca.cf.ac.uk/133740/1/how_does_lipschitz_regularization_influence_gan_training.pdf
dc.title: How Does Lipschitz Regularization Influence GAN Training?
dc.type: Conference Paper
dc.contributor.department: Visual Computing Center (VCC)
dc.contributor.department: Computer, Electrical and Mathematical Sciences and Engineering (CEMSE) Division
dc.contributor.department: Computer Science Program
dc.rights.embargodate: 2021-10-09
dc.conference.date: 2020-08-23 to 2020-08-28
dc.conference.name: 16th European Conference on Computer Vision, ECCV 2020
dc.conference.location: Glasgow, GBR
dc.eprint.version: Post-print
dc.contributor.institution: Cardiff University, Cardiff, UK
dc.contributor.institution: UCL/Adobe Research, London, UK
dc.identifier.volume: 12361 LNCS
dc.identifier.pages: 310-326
dc.identifier.arxivid: 1811.09567
kaust.person: Qin, Yipeng
kaust.person: Wonka, Peter
kaust.grant.number: CRG2018-3730
dc.identifier.eid: 2-s2.0-85092937698
refterms.dateFOA: 2020-12-08T07:41:55Z
kaust.acknowledged.supportUnit: KAUST Office of Sponsored Research (OSR)
dc.date.published-online: 2020-10-10
dc.date.published-print: 2020


Files in this item

Name: Conference Paperfile1.pdf
Size: 979.6Kb
Format: PDF
Description: Post-print
Embargo End Date: 2021-10-09
