How Does Lipschitz Regularization Influence GAN Training?
dc.contributor.author | Qin, Yipeng | |
dc.contributor.author | Mitra, Niloy | |
dc.contributor.author | Wonka, Peter | |
dc.date.accessioned | 2020-11-04T13:14:26Z | |
dc.date.available | 2020-11-04T13:14:26Z | |
dc.date.issued | 2020-10-09 | |
dc.identifier.citation | Qin, Y., Mitra, N., & Wonka, P. (2020). How Does Lipschitz Regularization Influence GAN Training? Lecture Notes in Computer Science, 310–326. doi:10.1007/978-3-030-58517-4_19 | |
dc.identifier.isbn | 9783030585167 | |
dc.identifier.issn | 1611-3349 | |
dc.identifier.issn | 0302-9743 | |
dc.identifier.doi | 10.1007/978-3-030-58517-4_19 | |
dc.identifier.uri | http://hdl.handle.net/10754/665819 | |
dc.description.abstract | Despite the success of Lipschitz regularization in stabilizing GAN training, the exact reason for its effectiveness remains poorly understood. The direct effect of K-Lipschitz regularization is to restrict the L2-norm of the neural network gradient to be smaller than a threshold K (e.g., K = 1) such that ‖∇f‖ ≤ K. In this work, we uncover an even more important effect of Lipschitz regularization by examining its impact on the loss function: it degenerates GAN loss functions to almost linear ones by restricting their domain and the interval of attainable gradient values. Our analysis shows that loss functions are only successful if they are degenerated to almost linear ones. We also show that loss functions perform poorly if they are not degenerated, and that a wide range of functions can serve as the loss function as long as they are sufficiently degenerated by regularization. In essence, Lipschitz regularization ensures that all loss functions effectively work in the same way. Empirically, we verify our proposition on the MNIST, CIFAR10 and CelebA datasets. | |
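As a side note on the constraint described in the abstract: for a linear map f(x) = Wx, the gradient L2-norm ‖∇f‖ equals the spectral norm (largest singular value) of W, so dividing W by its spectral norm enforces a 1-Lipschitz constraint. The sketch below illustrates this (it is an illustrative example of the K = 1 case, in the spirit of spectral normalization, not code from the paper itself):

```python
import numpy as np

# Illustrative sketch (not from the paper): for the linear map f(x) = W @ x,
# the Lipschitz constant w.r.t. the L2-norm is the spectral norm of W.
rng = np.random.default_rng(0)
W = rng.normal(size=(4, 4))

lip = np.linalg.norm(W, ord=2)  # spectral norm = Lipschitz constant of x -> W @ x
W_sn = W / lip                  # spectral normalization: rescale to K = 1

# The normalized map satisfies ||grad f|| <= 1, i.e. it is 1-Lipschitz.
print(np.linalg.norm(W_sn, ord=2))
```

For a deep network, the same idea is applied per layer, since the Lipschitz constant of a composition is bounded by the product of the per-layer constants.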
dc.description.sponsorship | This work was supported in part by the KAUST Office of Sponsored Research (OSR) under Award No. OSR-CRG2018-3730. | |
dc.publisher | Springer Nature | |
dc.relation.url | http://link.springer.com/10.1007/978-3-030-58517-4_19 | |
dc.relation.url | http://orca.cf.ac.uk/133740/1/how_does_lipschitz_regularization_influence_gan_training.pdf | |
dc.rights | Archived with thanks to Springer International Publishing | |
dc.rights | This file is an open access version redistributed from: http://orca.cf.ac.uk/133740/1/how_does_lipschitz_regularization_influence_gan_training.pdf | |
dc.title | How Does Lipschitz Regularization Influence GAN Training? | |
dc.type | Conference Paper | |
dc.contributor.department | Visual Computing Center (VCC) | |
dc.contributor.department | Computer, Electrical and Mathematical Sciences and Engineering (CEMSE) Division | |
dc.contributor.department | Computer Science Program | |
dc.rights.embargodate | 2021-10-09 | |
dc.conference.date | 2020-08-23 to 2020-08-28 | |
dc.conference.name | 16th European Conference on Computer Vision, ECCV 2020 | |
dc.conference.location | Glasgow, GBR | |
dc.eprint.version | Post-print | |
dc.contributor.institution | Cardiff University, Cardiff, UK | |
dc.contributor.institution | UCL/Adobe Research, London, UK | |
dc.identifier.volume | 12361 LNCS | |
dc.identifier.pages | 310-326 | |
dc.identifier.arxivid | 1811.09567 | |
kaust.person | Qin, Yipeng | |
kaust.person | Wonka, Peter | |
kaust.grant.number | CRG2018-3730 | |
dc.identifier.eid | 2-s2.0-85092937698 | |
refterms.dateFOA | 2020-12-08T07:41:55Z | |
kaust.acknowledged.supportUnit | KAUST Office of Sponsored Research (OSR) | |
dc.date.published-online | 2020-10-10 | |
dc.date.published-print | 2020 |
This item appears in the following Collection(s):
- Conference Papers
- Computer Science Program (for more information visit: https://cemse.kaust.edu.sa/cs)
- Visual Computing Center (VCC)
- Computer, Electrical and Mathematical Sciences and Engineering (CEMSE) Division (for more information visit: https://cemse.kaust.edu.sa/)