Paper Title
Controlling Multiple Errors Simultaneously with a PAC-Bayes Bound
Paper Authors
Paper Abstract
Current PAC-Bayes generalisation bounds are restricted to scalar metrics of performance, such as the loss or error rate. However, one ideally wants more information-rich certificates that control the entire distribution of possible outcomes, such as the distribution of the test loss in regression, or the probabilities of different mis-classifications. We provide the first PAC-Bayes bound capable of providing such rich information by bounding the Kullback-Leibler divergence between the empirical and true probabilities of a set of $M$ error types, which can either be discretised loss values for regression, or the elements of the confusion matrix (or a partition thereof) for classification. We transform our bound into a differentiable training objective. Our bound is especially useful in cases where the severity of different mis-classifications may change over time; existing PAC-Bayes bounds can only bound a particular pre-decided weighting of the error types. In contrast, our bound implicitly controls all uncountably many weightings simultaneously.
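For orientation only (this is not stated in the abstract), a PAC-Bayes-kl bound over $M$ error types would typically take a form along the following lines, where the complexity term $\xi(n, M)$ is a placeholder assumption rather than the paper's actual constant:

$$\mathrm{kl}\!\left(\hat{\mathbf{p}}_S(Q)\,\middle\|\,\mathbf{p}_D(Q)\right)\;\le\;\frac{\mathrm{KL}(Q\,\|\,P)+\ln\frac{\xi(n,M)}{\delta}}{n},\qquad \mathrm{kl}(\hat{\mathbf{p}}\,\|\,\mathbf{p})=\sum_{m=1}^{M}\hat{p}_m\ln\frac{\hat{p}_m}{p_m},$$

with $\hat{\mathbf{p}}_S(Q)$ and $\mathbf{p}_D(Q)$ the empirical and true probability vectors over the $M$ error types under the posterior $Q$, $P$ the prior, $n$ the sample size, and $\delta$ the confidence parameter. Because the left-hand side constrains the whole probability vector rather than a single scalar risk, any fixed weighting of the error types is controlled once the vector itself is controlled.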