Paper Title

Optimistic Optimization of Gaussian Process Samples

Authors

Julia Grosse, Cheng Zhang, Philipp Hennig

Abstract

Bayesian optimization is a popular formalism for global optimization, but its computational costs limit it to expensive-to-evaluate functions. A competing, computationally more efficient global-optimization framework is optimistic optimization, which exploits prior knowledge about the geometry of the search space in the form of a dissimilarity function. We investigate to which degree the conceptual advantages of Bayesian optimization can be combined with the computational efficiency of optimistic optimization. By mapping the kernel to a dissimilarity, we obtain an optimistic optimization algorithm for the Bayesian optimization setting with a run-time of up to $\mathcal{O}(N \log N)$. As a high-level take-away, we find that when using stationary kernels on objectives of relatively low evaluation cost, optimistic optimization can be strongly preferable to Bayesian optimization, while for strongly coupled and parametric models, good implementations of Bayesian optimization can perform much better, even at low evaluation cost. We argue that there is a new research domain between geometric and probabilistic search, i.e., methods that run drastically faster than traditional Bayesian optimization while retaining some of its crucial functionality.
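The abstract's core idea, mapping a kernel to a dissimilarity and then running a geometric, bound-based search, can be illustrated with a minimal sketch. This is not the paper's implementation: the RBF kernel, the smoothness constant `L`, and the DOO-style splitting loop below are illustrative assumptions. The kernel-to-dissimilarity map used here is the standard pseudo-metric induced by a normalized stationary kernel, $d(x, y)^2 = 2\,(1 - k(x - y))$.

```python
import math

def rbf(r, lengthscale=1.0):
    """Stationary RBF kernel evaluated at distance r = |x - y|."""
    return math.exp(-r**2 / (2 * lengthscale**2))

def dissimilarity(r, lengthscale=1.0):
    """Pseudo-metric induced by a normalized stationary kernel:
    d(x, y)^2 = k(0) - 2 k(r) + k(0) = 2 (1 - k(r))."""
    return math.sqrt(2.0 * (1.0 - rbf(r, lengthscale)))

def optimistic_maximize(f, lo, hi, n_splits=60, lengthscale=1.0, L=2.0):
    """Toy DOO-style optimistic search on [lo, hi]: repeatedly split the
    cell whose optimistic upper bound f(center) + L * d(half_width) is
    largest. L is an assumed smoothness constant w.r.t. the dissimilarity."""
    mid = 0.5 * (lo + hi)
    cells = [(lo, hi, f(mid))]          # (left, right, f at center)
    best_x, best_f = mid, cells[0][2]
    for _ in range(n_splits):
        # select the most optimistic cell
        i = max(range(len(cells)),
                key=lambda j: cells[j][2]
                + L * dissimilarity(0.5 * (cells[j][1] - cells[j][0]),
                                    lengthscale))
        a, b, _ = cells.pop(i)
        c = 0.5 * (a + b)
        for left, right in ((a, c), (c, b)):
            m = 0.5 * (left + right)
            fm = f(m)
            if fm > best_f:
                best_x, best_f = m, fm
            cells.append((left, right, fm))
    return best_x, best_f

# Usage: maximize a smooth objective without any posterior computation.
x_star, f_star = optimistic_maximize(lambda x: -(x - 0.3) ** 2, 0.0, 1.0)
```

Note that this sketch selects the best cell by a linear scan, which is quadratic in the number of splits; reaching anything like the $\mathcal{O}(N \log N)$ run-time claimed in the abstract would require a priority structure (e.g. a heap) over the cells.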
