Paper Title

Neural Model-based Optimization with Right-Censored Observations

Authors

Katharina Eggensperger, Kai Haase, Philipp Müller, Marius Lindauer, Frank Hutter

Abstract

In many fields of study, we only observe lower bounds on the true response value of some experiments. When fitting a regression model to predict the distribution of the outcomes, we cannot simply drop these right-censored observations, but need to properly model them. In this work, we focus on the concept of censored data in the light of model-based optimization, where prematurely terminating evaluations (and thus generating right-censored data) is a key factor for efficiency, e.g., when searching for an algorithm configuration that minimizes the runtime of the algorithm at hand. Neural networks (NNs) have been demonstrated to work well at the core of model-based optimization procedures, and here we extend them to handle these censored observations. We propose (i) a loss function based on the Tobit model to incorporate censored samples into training and (ii) the use of an ensemble of networks to model the posterior distribution. To nevertheless keep the optimization overhead low, we propose to use Thompson sampling such that we only need to train a single NN in each iteration. Our experiments show that our trained regression models achieve better predictive quality than several baselines and that our approach achieves new state-of-the-art performance for model-based optimization on two optimization problems: minimizing the solution time of a SAT solver and the time-to-accuracy of neural networks.
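For readers who want a concrete picture of the Tobit-style loss mentioned above, the sketch below shows one way to write such a negative log-likelihood for a network that predicts a Gaussian mean and log standard deviation per input. It is a minimal illustration under those assumptions, written in PyTorch; the function name `tobit_nll`, the epsilon clamp, and the output parameterization are illustrative choices, not the authors' exact implementation.

```python
import torch
from torch.distributions import Normal

def tobit_nll(mu, log_sigma, y, censored):
    """Tobit-style negative log-likelihood with right censoring (illustrative sketch).

    mu, log_sigma : predicted Gaussian mean and log std-dev per sample.
    y             : observed response; for censored samples this is the lower bound
                    produced by a prematurely terminated evaluation.
    censored      : boolean tensor, True where the observation is right-censored.
    """
    dist = Normal(mu, torch.exp(log_sigma))
    # Uncensored samples contribute the usual Gaussian log-density.
    ll_exact = dist.log_prob(y)
    # Right-censored samples contribute log P(Y > y): the true value is only
    # known to lie above the observed bound.
    ll_censored = torch.log1p(-dist.cdf(y) + 1e-12)  # small epsilon for numerical stability
    ll = torch.where(censored, ll_censored, ll_exact)
    return -ll.mean()
```

Training an ensemble of networks with such a loss, and sampling one ensemble member per iteration in a Thompson-sampling fashion, is one way to obtain the posterior-style predictive distribution the abstract refers to.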
