Paper title
PAC-Bayes unleashed: generalisation bounds with unbounded losses
Paper authors
Paper abstract
We present new PAC-Bayesian generalisation bounds for learning problems with unbounded loss functions. This extends the relevance and applicability of the PAC-Bayes learning framework, where most of the existing literature focuses on supervised learning problems with a bounded loss function (typically assumed to take values in the interval [0;1]). In order to relax this assumption, we propose a new notion called HYPE (standing for \emph{HYPothesis-dependent rangE}), which effectively allows the range of the loss to depend on each predictor. Based on this new notion we derive a novel PAC-Bayesian generalisation bound for unbounded loss functions, and we instantiate it on a linear regression problem. To make our theory usable by the largest audience possible, we include discussions on actual computation, practicality and limitations of our assumptions.
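As a hedged reading aid (the formal statement is given in the body of the paper; the symbols $\mathcal{H}$, $\mathcal{Z}$, $\ell$ and $K$ below are our own notation for the hypothesis space, data space, loss and range function), the HYPE condition can be sketched as requiring the loss to be bounded by a function of the predictor alone:
\[
\exists\, K : \mathcal{H} \to \mathbb{R}^{+} \ \text{ such that } \ \sup_{z \in \mathcal{Z}} \ell(h, z) \le K(h) \quad \text{for every } h \in \mathcal{H}.
\]
A uniformly bounded loss is then the special case where $K$ is constant, while for the linear regression instantiation mentioned in the abstract, $K$ would be allowed to grow with, for example, the norm of the predictor's weights.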