Paper Title

Predictive Inference Is Free with the Jackknife+-after-Bootstrap

Paper Authors

Byol Kim, Chen Xu, Rina Foygel Barber

Paper Abstract

Ensemble learning is widely used in applications to make predictions in complex decision problems---for example, averaging models fitted to a sequence of samples bootstrapped from the available training data. While such methods offer more accurate, stable, and robust predictions and model estimates, much less is known about how to perform valid, assumption-lean inference on the output of these types of procedures. In this paper, we propose the jackknife+-after-bootstrap (J+aB), a procedure for constructing a predictive interval, which uses only the available bootstrapped samples and their corresponding fitted models, and is therefore "free" in terms of the cost of model fitting. The J+aB offers a predictive coverage guarantee that holds with no assumptions on the distribution of the data, the nature of the fitted model, or the way in which the ensemble of models are aggregated---at worst, the failure rate of the predictive interval is inflated by a factor of 2. Our numerical experiments verify the coverage and accuracy of the resulting predictive intervals on real data.
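
To make the idea concrete, below is a minimal Python sketch of a J+aB-style interval construction, under stated assumptions: the inputs are NumPy arrays, the ensemble aggregation is a plain mean, and scikit-learn decision trees serve as the base learner. The function name `jackknife_plus_after_bootstrap` is hypothetical, and the sketch omits the randomization of the number of bootstrap draws that the paper uses for its exact coverage guarantee, so it is an illustration rather than the authors' reference implementation.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor  # assumed base learner for illustration

def jackknife_plus_after_bootstrap(X, y, X_test, alpha=0.1, B=50, seed=0):
    """Sketch of a J+aB-style predictive interval with mean aggregation.

    Reuses the B bootstrapped fits: each training point i is predicted only
    by the models whose bootstrap resample excludes i, giving leave-one-out
    residuals with no additional model fitting.
    """
    rng = np.random.default_rng(seed)
    n = len(y)
    boot_idx = [rng.integers(0, n, size=n) for _ in range(B)]
    models = [DecisionTreeRegressor(random_state=b).fit(X[idx], y[idx])
              for b, idx in enumerate(boot_idx)]

    n_test = X_test.shape[0]
    lower_parts = np.full((n, n_test), -np.inf)
    upper_parts = np.full((n, n_test), np.inf)
    for i in range(n):
        # out-of-bag aggregation: use only the models that never saw point i
        oob = [m for m, idx in zip(models, boot_idx) if i not in idx]
        if not oob:  # point i landed in every resample; keep the infinite band
            continue
        pred_i = np.mean([m.predict(X[i:i + 1])[0] for m in oob])
        pred_test = np.mean([m.predict(X_test) for m in oob], axis=0)
        r_i = abs(y[i] - pred_i)                    # leave-one-out residual
        lower_parts[i] = pred_test - r_i
        upper_parts[i] = pred_test + r_i

    # jackknife+ quantiles over the n leave-one-out bands
    k = min(int(np.ceil((1 - alpha) * (n + 1))), n)
    lower = np.sort(lower_parts, axis=0)[n - k]     # k-th largest lower part
    upper = np.sort(upper_parts, axis=0)[k - 1]     # k-th smallest upper part
    return lower, upper
```

Calling `lo, hi = jackknife_plus_after_bootstrap(X_train, y_train, X_test)` returns an interval for each test point; per the abstract, the miscoverage of the paper's J+aB interval (with its appropriately randomized number of bootstrap draws) is at worst twice the nominal level alpha.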
