Paper Title
Neighbor Regularized Bayesian Optimization for Hyperparameter Optimization
Paper Authors
Abstract
Bayesian Optimization (BO) is a common solution for searching for optimal hyperparameters based on sample observations of a machine learning model. Existing BO algorithms may converge slowly, or even collapse, when the potential observation noise misdirects the optimization. In this paper, we propose a novel BO algorithm called Neighbor Regularized Bayesian Optimization (NRBO) to solve this problem. We first propose a neighbor-based regularization to smooth each sample observation, which can effectively reduce the observation noise without any extra training cost. Since the neighbor regularization highly depends on the sample density of a neighbor area, we further design a density-based acquisition function to adjust the acquisition reward and obtain more stable statistics. In addition, we design an adjustment mechanism to ensure the framework maintains a reasonable regularization strength and density reward conditioned on the remaining computation resources. We conduct experiments on the Bayesmark benchmark and on important computer vision benchmarks such as ImageNet and COCO. Extensive experiments demonstrate the effectiveness of NRBO, which consistently outperforms other state-of-the-art methods.
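The core idea of the neighbor-based regularization described above can be illustrated with a minimal sketch: each noisy observation is pulled toward the mean of observations whose hyperparameter points lie within a small radius. The function name, the fixed `radius`, and the blending `weight` here are illustrative assumptions, not the paper's actual formulation (which also adapts the regularization strength during search).

```python
import numpy as np

def neighbor_smooth(X, y, radius=0.1, weight=0.5):
    """Hypothetical sketch of neighbor-based observation smoothing.

    Each observation y[i] is blended with the mean observation of all
    sample points within `radius` of X[i] (including itself). Points
    with no neighbors inside the radius are left unchanged.
    """
    X = np.asarray(X, dtype=float)
    y = np.asarray(y, dtype=float)
    y_smooth = y.copy()
    for i in range(len(y)):
        # Euclidean distance from X[i] to every sampled point.
        d = np.linalg.norm(X - X[i], axis=1)
        mask = d <= radius
        if mask.sum() > 1:  # at least one neighbor besides the point itself
            y_smooth[i] = (1.0 - weight) * y[i] + weight * y[mask].mean()
    return y_smooth
```

For example, with points `[[0.0], [0.05], [1.0]]` and observations `[0.0, 1.0, 5.0]`, the first two points are mutual neighbors and are smoothed toward their local mean of 0.5, while the isolated third point is unchanged. This also shows why the paper's density-based acquisition term matters: the smoothing is only informative where the sample density around a point is high.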