Arenz, O., Zhong, M. and Neumann, G. (2018) Efficient Gradient-Free Variational Inference using Policy Search. In: Proceedings of the International Conference on Machine Learning.
PDF: VIPS_and_supplement.pdf (Whole Document, 3MB)
Item Type: Conference or Workshop contribution (Paper)
Item Status: Live Archive
Abstract
Inference from complex distributions is a common problem in machine learning and is required by many Bayesian methods. We propose an efficient, gradient-free method for learning general GMM approximations of multimodal distributions based on recent insights from stochastic search methods. Our method establishes information-geometric trust regions to ensure efficient exploration of the sampling space and stability of the GMM updates, allowing for efficient estimation of multivariate Gaussian variational distributions. For GMMs, we apply a variational lower bound to decompose the learning objective into sub-problems given by learning the individual mixture components and the coefficients. The number of mixture components is adapted online to allow for arbitrarily exact approximations. We demonstrate on several domains that we can learn significantly better approximations than competing variational inference methods and that the quality of samples drawn from our approximations is on par with samples created by state-of-the-art MCMC samplers that require significantly more computational resources.
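
The abstract does not spell out the objective or its decomposition; as a hedged sketch in standard notation (not necessarily the paper's exact formulation), with an unnormalized target $\tilde{p}(x)$, normalizing constant $Z$, and a GMM approximation $q(x)$, the variational lower bound being maximized can be written as

$$
\log Z \;\ge\; L(q) \;=\; \mathbb{E}_{q(x)}\!\left[\log \tilde{p}(x)\right] + H\!\big(q(x)\big),
\qquad
q(x) = \sum_{o} q(o)\, q(x \mid o),
$$

where $o$ indexes the mixture components and $H$ denotes entropy. Introducing an auxiliary distribution over the component index gives a lower bound on $L(q)$ that separates into one sub-problem per Gaussian component $q(x \mid o)$ and one for the mixture weights $q(o)$; each update can further be constrained by an information-geometric trust region of the form $\mathrm{KL}\big(q_{\text{new}}(x \mid o)\,\|\,q_{\text{old}}(x \mid o)\big) \le \epsilon$ to keep the updates stable.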