Coarse Grained Exponential Variational Autoencoders

Handle URI:
http://hdl.handle.net/10754/626472
Title:
Coarse Grained Exponential Variational Autoencoders
Authors:
Sun, Ke; Zhang, Xiangliang (0000-0002-3574-5665)
Abstract:
Variational autoencoders (VAEs) often use a Gaussian or categorical distribution to model the inference process. This limits variational learning because the simplified assumption rarely matches the true posterior distribution, which is usually far more complex. To break this limitation and admit arbitrary parametric distributions during inference, this paper derives a semi-continuous latent representation, which approximates a continuous density up to a prescribed precision and is much easier to analyze than its continuous counterpart because it is fundamentally discrete. We showcase the proposition by applying polynomial exponential family distributions, which are universal probability density function generators, as the posterior. Our experimental results show consistent improvements over commonly used VAE models.
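The abstract's central idea, approximating a continuous density by a discrete distribution on a fine grid, can be sketched as follows. This is an illustrative sketch only, not the paper's implementation: the grid range, bin count, and the coefficient convention are assumptions made here for demonstration.

```python
import numpy as np

def semi_continuous_pmf(theta, lo=-4.0, hi=4.0, n_bins=256):
    """Discretize p(z) proportional to exp(sum_k theta[k] * z**k)
    onto n_bins grid points, giving a 'semi-continuous' pmf."""
    z = np.linspace(lo, hi, n_bins)
    # Evaluate the log-density polynomial; theta is in ascending-degree
    # order, while np.polyval expects the highest degree first.
    log_p = np.polyval(theta[::-1], z)
    log_p -= log_p.max()          # subtract max for numerical stability
    p = np.exp(log_p)
    p /= p.sum()                  # normalize into a discrete pmf
    return z, p

# Example: theta = [0, 0, -0.5] gives p(z) proportional to exp(-z^2 / 2),
# so the discrete pmf approximates a standard Gaussian shape.
grid, pmf = semi_continuous_pmf(np.array([0.0, 0.0, -0.5]))
samples = np.random.default_rng(0).choice(grid, size=5, p=pmf)
```

Because the resulting distribution is a finite pmf, sampling and entropy computations reduce to ordinary discrete operations, which is the analytical convenience the abstract refers to.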
KAUST Department:
Computer, Electrical and Mathematical Sciences and Engineering (CEMSE) Division
Publisher:
arXiv
Issue Date:
25-Feb-2017
ARXIV:
arXiv:1702.07904
Type:
Preprint
Additional Links:
http://arxiv.org/abs/1702.07904v1; http://arxiv.org/pdf/1702.07904v1
Appears in Collections:
Other/General Submission; Computer, Electrical and Mathematical Sciences and Engineering (CEMSE) Division

Full metadata record

DC Field: Value (Language)
dc.contributor.author: Sun, Ke (en)
dc.contributor.author: Zhang, Xiangliang (en)
dc.date.accessioned: 2017-12-28T07:32:11Z (-)
dc.date.available: 2017-12-28T07:32:11Z (-)
dc.date.issued: 2017-02-25 (en)
dc.identifier.uri: http://hdl.handle.net/10754/626472 (-)
dc.description.abstract: Variational autoencoders (VAEs) often use a Gaussian or categorical distribution to model the inference process. This limits variational learning because the simplified assumption rarely matches the true posterior distribution, which is usually far more complex. To break this limitation and admit arbitrary parametric distributions during inference, this paper derives a semi-continuous latent representation, which approximates a continuous density up to a prescribed precision and is much easier to analyze than its continuous counterpart because it is fundamentally discrete. We showcase the proposition by applying polynomial exponential family distributions, which are universal probability density function generators, as the posterior. Our experimental results show consistent improvements over commonly used VAE models. (en)
dc.publisher: arXiv (en)
dc.relation.url: http://arxiv.org/abs/1702.07904v1 (en)
dc.relation.url: http://arxiv.org/pdf/1702.07904v1 (en)
dc.rights: Archived with thanks to arXiv (en)
dc.title: Coarse Grained Exponential Variational Autoencoders (en)
dc.type: Preprint (en)
dc.contributor.department: Computer, Electrical and Mathematical Sciences and Engineering (CEMSE) Division (en)
dc.eprint.version: Pre-print (en)
dc.identifier.arxivid: arXiv:1702.07904 (en)
kaust.author: Sun, Ke (en)
kaust.author: Zhang, Xiangliang (en)
All Items in KAUST are protected by copyright, with all rights reserved, unless otherwise indicated.