Paper Title


Selective Classification via One-Sided Prediction

Paper Authors

Aditya Gangrade, Anil Kag, Venkatesh Saligrama

Paper Abstract


We propose a novel method for selective classification (SC), a problem which allows a classifier to abstain from predicting some instances, thus trading off accuracy against coverage (the fraction of instances predicted). In contrast to prior gating or confidence-set based work, our proposed method optimises a collection of class-wise decoupled one-sided empirical risks, and is in essence a method for explicitly finding the largest decision sets for each class that have few false positives. This one-sided prediction (OSP) based relaxation yields an SC scheme that attains near-optimal coverage in the practically relevant high target accuracy regime, and further admits efficient implementation, leading to a flexible and principled method for SC. We theoretically derive generalization bounds for SC and OSP, and empirically we show that our scheme strongly outperforms state of the art methods in coverage at small error levels.
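To make the abstract's decision rule concrete, the sketch below illustrates the general one-sided prediction idea: each class has its own acceptance set (here modelled by a hypothetical per-class score and threshold, not the paper's actual training procedure), the classifier predicts when exactly one class accepts the instance, and abstains otherwise. The names class_scores and thresholds are illustrative assumptions, not quantities defined in the paper.

    import numpy as np

    def osp_selective_predict(class_scores, thresholds):
        """Illustrative selective prediction from per-class one-sided scores.

        class_scores: (n, K) array of per-class confidence scores, e.g. from K
            one-vs-rest classifiers trained so that false positives are rare
            (a stand-in for the paper's class-wise decoupled one-sided risks).
        thresholds: length-K array; class k "accepts" an instance when its
            score clears the threshold (hypothetical acceptance rule).
        Returns predicted labels, with -1 meaning "abstain".
        """
        accepts = class_scores >= thresholds        # (n, K) boolean acceptance sets
        n_accepting = accepts.sum(axis=1)
        preds = np.full(class_scores.shape[0], -1)  # default: abstain
        unique = n_accepting == 1                   # exactly one class claims x
        preds[unique] = accepts[unique].argmax(axis=1)
        return preds

Raising the thresholds shrinks each class's acceptance set, trading coverage (fraction of instances predicted) for accuracy, which is the coverage/accuracy trade-off the abstract refers to.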
