Guaranteed Bounds on Information-Theoretic Measures of Univariate Mixtures Using Piecewise Log-Sum-Exp Inequalities

Handle URI:
http://hdl.handle.net/10754/622680
Title:
Guaranteed Bounds on Information-Theoretic Measures of Univariate Mixtures Using Piecewise Log-Sum-Exp Inequalities
Authors:
Nielsen, Frank; Sun, Ke
Abstract:
Information-theoretic measures, such as the entropy, the cross-entropy and the Kullback-Leibler divergence between two mixture models, are core primitives in many signal processing tasks. Since the Kullback-Leibler divergence of mixtures provably does not admit a closed-form formula, it is in practice either estimated using costly Monte Carlo stochastic integration, approximated or bounded using various techniques. We present a fast and generic method that builds algorithmically closed-form lower and upper bounds on the entropy, the cross-entropy, the Kullback-Leibler and the α-divergences of mixtures. We illustrate the versatile method by reporting our experiments for approximating the Kullback-Leibler and the α-divergences between univariate exponential mixtures, Gaussian mixtures, Rayleigh mixtures and Gamma mixtures.
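The abstract's starting point is that the log-density of a mixture is a log-sum-exp (LSE) of the component log-densities, so elementary LSE inequalities already sandwich it deterministically. The following is a hedged, minimal sketch (not the authors' code; all function names are illustrative) contrasting such pointwise bounds with the costly Monte Carlo estimation the abstract mentions, for univariate Gaussian mixtures:

```python
import math
import random

def log_gauss(x, mu, sigma):
    """Log-density of the univariate normal N(mu, sigma^2)."""
    return -0.5 * math.log(2 * math.pi * sigma ** 2) - (x - mu) ** 2 / (2 * sigma ** 2)

def log_mixture(x, ws, mus, sigmas):
    """Exact log-density of a Gaussian mixture, computed stably via log-sum-exp."""
    terms = [math.log(w) + log_gauss(x, m, s) for w, m, s in zip(ws, mus, sigmas)]
    hi = max(terms)
    return hi + math.log(sum(math.exp(t - hi) for t in terms))

def lse_bounds(x, ws, mus, sigmas):
    """Elementary LSE sandwich on log m(x) for a k-component mixture:
    max_i (log w_i + log p_i(x)) <= log m(x) <= max_i (...) + log k."""
    terms = [math.log(w) + log_gauss(x, m, s) for w, m, s in zip(ws, mus, sigmas)]
    return max(terms), max(terms) + math.log(len(terms))

def kl_monte_carlo(ws1, mus1, sig1, ws2, mus2, sig2, n=20000, seed=0):
    """Costly Monte Carlo estimate of KL(m1 || m2), for comparison."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        # Sample from m1: pick a component, then draw from it.
        i = rng.choices(range(len(ws1)), weights=ws1)[0]
        x = rng.gauss(mus1[i], sig1[i])
        total += log_mixture(x, ws1, mus1, sig1) - log_mixture(x, ws2, mus2, sig2)
    return total / n
```

The paper tightens the crude log k gap by working piecewise, splitting the real line into intervals where a single weighted component dominates; the sketch above only shows the global inequality max_i y_i ≤ lse(y) ≤ max_i y_i + log k from which those piecewise bounds are built.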
KAUST Department:
Computer, Electrical and Mathematical Sciences and Engineering (CEMSE) Division
Citation:
Nielsen F, Sun K (2016) Guaranteed Bounds on Information-Theoretic Measures of Univariate Mixtures Using Piecewise Log-Sum-Exp Inequalities. Entropy 18: 442. Available: http://dx.doi.org/10.3390/e18120442.
Publisher:
MDPI AG
Journal:
Entropy
Issue Date:
9-Dec-2016
DOI:
10.3390/e18120442
Type:
Article
ISSN:
1099-4300
Sponsors:
The authors gratefully thank the referees for their comments. This work was carried out while Ke Sun was visiting Frank Nielsen at École Polytechnique, Palaiseau, France.
Additional Links:
http://www.mdpi.com/1099-4300/18/12/442
Appears in Collections:
Articles; Computer, Electrical and Mathematical Sciences and Engineering (CEMSE) Division

Full metadata record

DC Field | Value | Language
dc.contributor.author | Nielsen, Frank | en
dc.contributor.author | Sun, Ke | en
dc.date.accessioned | 2017-01-11T12:20:30Z | -
dc.date.available | 2017-01-11T12:20:30Z | -
dc.date.issued | 2016-12-09 | en
dc.identifier.citation | Nielsen F, Sun K (2016) Guaranteed Bounds on Information-Theoretic Measures of Univariate Mixtures Using Piecewise Log-Sum-Exp Inequalities. Entropy 18: 442. Available: http://dx.doi.org/10.3390/e18120442. | en
dc.identifier.issn | 1099-4300 | en
dc.identifier.doi | 10.3390/e18120442 | en
dc.identifier.uri | http://hdl.handle.net/10754/622680 | -
dc.description.abstract | Information-theoretic measures, such as the entropy, the cross-entropy and the Kullback-Leibler divergence between two mixture models, are core primitives in many signal processing tasks. Since the Kullback-Leibler divergence of mixtures provably does not admit a closed-form formula, it is in practice either estimated using costly Monte Carlo stochastic integration, approximated or bounded using various techniques. We present a fast and generic method that builds algorithmically closed-form lower and upper bounds on the entropy, the cross-entropy, the Kullback-Leibler and the α-divergences of mixtures. We illustrate the versatile method by reporting our experiments for approximating the Kullback-Leibler and the α-divergences between univariate exponential mixtures, Gaussian mixtures, Rayleigh mixtures and Gamma mixtures. | en
dc.description.sponsorship | The authors gratefully thank the referees for their comments. This work was carried out while Ke Sun was visiting Frank Nielsen at École Polytechnique, Palaiseau, France. | en
dc.publisher | MDPI AG | en
dc.relation.url | http://www.mdpi.com/1099-4300/18/12/442 | en
dc.rights | This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited. | en
dc.rights.uri | https://creativecommons.org/licenses/by/4.0/ | en
dc.subject | α-divergences | en
dc.subject | Information geometry | en
dc.subject | Log-sum-exp bounds | en
dc.subject | Mixture models | en
dc.title | Guaranteed Bounds on Information-Theoretic Measures of Univariate Mixtures Using Piecewise Log-Sum-Exp Inequalities | en
dc.type | Article | en
dc.contributor.department | Computer, Electrical and Mathematical Sciences and Engineering (CEMSE) Division | en
dc.identifier.journal | Entropy | en
dc.eprint.version | Publisher's Version/PDF | en
dc.contributor.institution | Computer Science Department LIX, École Polytechnique, Palaiseau Cedex, 91128, France | en
dc.contributor.institution | Sony Computer Science Laboratories Inc., Tokyo, 141-0022, Japan | en
kaust.author | Sun, Ke | en
All Items in KAUST are protected by copyright, with all rights reserved, unless otherwise indicated.