Paper Title

Optimizing Approximate Leave-one-out Cross-validation to Tune Hyperparameters

Authors

Burn, Ryan

Abstract

For a large class of regularized models, leave-one-out cross-validation can be efficiently estimated with an approximate leave-one-out formula (ALO). We consider the problem of adjusting hyperparameters so as to optimize ALO. We derive efficient formulas to compute the gradient and Hessian of ALO and show how to apply a second-order optimizer to find hyperparameters. We demonstrate the usefulness of the proposed approach by finding hyperparameters for regularized logistic regression and ridge regression on various real-world data sets.
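The abstract's starting point, that leave-one-out cross-validation can be computed without refitting the model n times, is easiest to see for ridge regression, where the shortcut formula is in fact exact: the leave-one-out residual equals the full-fit residual divided by 1 − h_ii, with h_ii the i-th diagonal of the ridge hat matrix. The sketch below is illustrative only (the function name and data are ours, not the paper's implementation), and evaluates the LOO mean squared error for one value of the regularization hyperparameter:

```python
import numpy as np

def ridge_alo_mse(X, y, lam):
    """Leave-one-out MSE for ridge regression via the shortcut formula.

    For ridge regression the formula is exact: the LOO residual is the
    full-fit residual scaled by 1 / (1 - h_ii), where h_ii is the i-th
    diagonal of the hat matrix H = X (X'X + lam I)^{-1} X'.
    """
    n, p = X.shape
    G = X.T @ X + lam * np.eye(p)
    H = X @ np.linalg.solve(G, X.T)          # ridge hat matrix
    resid = y - H @ y                        # full-fit residuals
    loo_resid = resid / (1.0 - np.diag(H))   # exact LOO residuals
    return float(np.mean(loo_resid ** 2))
```

A crude way to tune `lam` is a grid search over this function; the paper's contribution is instead to differentiate the ALO objective, deriving its gradient and Hessian so that a second-order optimizer can locate the minimizing hyperparameters directly.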
