TU Darmstadt / ULB / TUbiblio

Trust-Region Variational Inference with Gaussian Mixture Models

Arenz, Oleg ; Zhong, Mingjun ; Neumann, Gerhard (2020)
Trust-Region Variational Inference with Gaussian Mixture Models.
In: Journal of Machine Learning Research, 21
Article, Bibliography

This is the latest version of this entry.

Abstract

Many methods for machine learning rely on approximate inference from intractable probability distributions. Variational inference approximates such distributions by tractable models that can be subsequently used for approximate inference. Learning sufficiently accurate approximations requires a rich model family and careful exploration of the relevant modes of the target distribution. We propose a method for learning accurate GMM approximations of intractable probability distributions based on insights from policy search by using information-geometric trust regions for principled exploration. For efficient improvement of the GMM approximation, we derive a lower bound on the corresponding optimization objective enabling us to update the components independently. Our use of the lower bound ensures convergence to a stationary point of the original objective. The number of components is adapted online by adding new components in promising regions and by deleting components with negligible weight. We demonstrate on several domains that we can learn approximations of complex, multimodal distributions with a quality that is unmet by previous variational inference methods, and that the GMM approximation can be used for drawing samples that are on par with samples created by state-of-the-art MCMC samplers while requiring up to three orders of magnitude less computational resources.
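The per-component decomposition described in the abstract can be illustrated with a toy sketch. Note this is not the paper's actual algorithm: below, each Gaussian component is updated independently against the target density tilted by its own responsibilities (mirroring the role of the lower bound), but a simple damped importance-weighted moment update stands in for the paper's information-geometric trust-region step, and the online adaptation of the number of components and mixture weights is omitted. The bimodal target, step size, and sample counts are all illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_target(x):
    # Unnormalized bimodal target: equal mixture of N(-3, 1) and N(3, 1).
    return np.logaddexp(-0.5 * (x - 3.0) ** 2, -0.5 * (x + 3.0) ** 2)

def log_normal(x, m, v):
    return -0.5 * (x - m) ** 2 / v - 0.5 * np.log(2.0 * np.pi * v)

# GMM approximation with two components (means, variances, log mixture weights).
means = np.array([-1.0, 1.0])
variances = np.array([1.0, 1.0])
log_weights = np.log([0.5, 0.5])

n_samples, eta = 500, 0.3
for step in range(100):
    for i in range(2):
        # Draw samples from component i only: the decomposition lets each
        # component be improved independently of the others.
        x = means[i] + np.sqrt(variances[i]) * rng.standard_normal(n_samples)
        log_q_i = log_normal(x, means[i], variances[i])
        # Log responsibilities log q(o=i | x) under the current mixture;
        # tilting the target by them keeps components on separate modes.
        log_joint = log_weights[:, None] + log_normal(
            x[None, :], means[:, None], variances[:, None]
        )
        log_resp_i = log_joint[i] - np.logaddexp.reduce(log_joint, axis=0)
        # Self-normalized importance weights toward the tilted target.
        lw = log_target(x) + log_resp_i - log_q_i
        w = np.exp(lw - lw.max())
        w /= w.sum()
        # Damped weighted-moment update: a crude stand-in for the paper's
        # closed-form trust-region update of each component.
        m_new = np.sum(w * x)
        v_new = np.sum(w * (x - m_new) ** 2)
        means[i] = (1 - eta) * means[i] + eta * m_new
        variances[i] = max((1 - eta) * variances[i] + eta * v_new, 1e-2)

print(np.sort(means))  # each component should settle near one of the modes at -3 and 3
```

The responsibility tilt is the key point: without it, every component's importance-weighted mean would drift toward the global mean of the target, and the mixture could not track separate modes.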

Item type: Article
Published: 2020
Author(s): Arenz, Oleg ; Zhong, Mingjun ; Neumann, Gerhard
Type of entry: Bibliography
Title: Trust-Region Variational Inference with Gaussian Mixture Models
Language: English
Year of publication: 2020
Place of publication: Darmstadt
Publisher: JMLR
Journal or series title: Journal of Machine Learning Research
Journal volume: 21
Collation: 60 pages

Keywords: approximate inference, variational inference, sampling, policy search, MCMC, Markov chain Monte Carlo
Dewey Decimal Classification (DDC) subject group: 000 Generalities, computer science, information science > 004 Computer science
Department(s)/Institute(s): 20 Department of Computer Science
20 Department of Computer Science > Intelligent Autonomous Systems
Date deposited: 02 Aug 2024 12:45
Last modified: 02 Aug 2024 12:45