Model-based relative entropy stochastic search

Abdolmaleki, A., Lioutikov, R., Lau, N., Paulo Reis, L., Peters, J. and Neumann, G. (2015) Model-based relative entropy stochastic search. In: Advances in Neural Information Processing Systems (NIPS), 7 - 12 December 2015, Montreal, Canada.


Item Type: Conference or Workshop contribution (Paper)
Item Status: Live Archive


Stochastic search algorithms are general black-box optimizers. Due to their ease
of use and their generality, they have recently gained a lot of attention in operations
research, machine learning and policy search. Yet, these algorithms require
many evaluations of the objective function, scale poorly with the problem dimension, are
affected by highly noisy objective functions and may converge prematurely. To
alleviate these problems, we introduce a new surrogate-based stochastic search
approach. We learn simple, quadratic surrogate models of the objective function.
As the quality of such a quadratic approximation is limited, we do not greedily exploit
the learned models, since the algorithm could be misled by an inaccurate optimum
introduced by the surrogate. Instead, we use information-theoretic constraints to
bound the 'distance' between the new and old data distributions while maximizing
the objective function. Additionally, the new method is able to sustain the exploration
of the search distribution to avoid premature convergence. We compare our
method with state-of-the-art black-box optimization methods on standard uni-modal
and multi-modal optimization functions, on simulated planar robot tasks and on a
complex robot ball-throwing task. The proposed method considerably outperforms
the existing approaches.
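The core idea of the abstract — fit a quadratic surrogate to samples of the objective, then update a Gaussian search distribution in a bounded, non-greedy step — can be sketched as follows. This is a minimal illustration, not the paper's algorithm: the KL and entropy constraints are replaced by a fixed temperature `eta` (a hypothetical hyper-parameter chosen here for the demo), whereas the paper optimises the corresponding Lagrange multipliers from the constraints at every iteration.

```python
import numpy as np

rng = np.random.default_rng(0)

def objective(x):
    # Toy maximization target: negative sphere, optimum at the origin.
    return -np.sum(x ** 2, axis=-1)

def fit_quadratic(X, y):
    """Least-squares fit of R(x) ~= x^T A x + a^T x + c with A symmetric."""
    n, d = X.shape
    feats, idx = [np.ones(n)], []
    for i in range(d):
        for j in range(i, d):
            feats.append(X[:, i] * X[:, j])
            idx.append((i, j))
    feats.extend(X[:, i] for i in range(d))
    Phi = np.column_stack(feats)
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    A = np.zeros((d, d))
    for k, (i, j) in enumerate(idx):
        # Off-diagonal features x_i*x_j carry 2*A_ij; diagonal carry A_ii.
        A[i, j] = A[j, i] = w[1 + k] / (1.0 if i == j else 2.0)
    a = w[1 + len(idx):]
    return A, a

# Hypothetical settings, not from the paper.
eta = 10.0                     # fixed KL "temperature": larger -> smaller step
d, iters, n_samples = 5, 30, 100
mu, Sigma = np.ones(d), np.eye(d)

for _ in range(iters):
    X = rng.multivariate_normal(mu, Sigma, size=n_samples)
    A, a = fit_quadratic(X, objective(X))
    # Tempered exponential-family update: pi_new ∝ pi_old * exp(R(x)/eta).
    # In natural parameters this adds -2A/eta to the precision and a/eta to
    # the linear term, i.e. a bounded step rather than a jump to the
    # surrogate's optimum. (Assumes the new precision stays positive
    # definite; the full algorithm enforces this via its constraints.)
    Lam = np.linalg.inv(Sigma)
    Lam_new = Lam - 2.0 * A / eta
    lam_new = Lam @ mu + a / eta
    Sigma = np.linalg.inv(Lam_new)
    mu = Sigma @ lam_new

print(np.round(mu, 3))
```

Because the toy objective is exactly quadratic, the surrogate fit is exact and the mean contracts smoothly toward the optimum instead of collapsing onto it in one greedy step; the fixed `eta` controls how conservative each update is.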

Keywords: Stochastic Search, Information Theory, Black-Box Optimization
Subjects: G Mathematical and Computer Sciences > G760 Machine Learning
Divisions: College of Science > School of Computer Science
ID Code: 25741
Deposited On: 13 Mar 2017 11:13
