Paper Title

Calibration of Model Uncertainty for Dropout Variational Inference

Authors

Max-Heinrich Laves, Sontje Ihler, Karl-Philipp Kortmann, Tobias Ortmaier

Abstract

The model uncertainty obtained by variational Bayesian inference with Monte Carlo dropout is prone to miscalibration. In this paper, different logit scaling methods are extended to dropout variational inference to recalibrate model uncertainty. Expected uncertainty calibration error (UCE) is presented as a metric to measure miscalibration. The effectiveness of recalibration is evaluated on CIFAR-10/100 and SVHN for recent CNN architectures. Experimental results show that logit scaling considerably reduces miscalibration in terms of UCE. Well-calibrated uncertainty enables reliable rejection of uncertain predictions and robust detection of out-of-distribution data.
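To make the pipeline described in the abstract concrete, below is a minimal PyTorch sketch of the three ingredients: Monte Carlo dropout inference, logit scaling with a single temperature parameter, and a UCE-style miscalibration measure that bins predictions by uncertainty and compares each bin's mean uncertainty with its empirical error rate. The function names, the entropy-based uncertainty measure, and the binning scheme are illustrative assumptions for this sketch, not the authors' released code.

```python
# Sketch: MC dropout + temperature-scaled logits + UCE-style metric.
import torch
import torch.nn.functional as F


def mc_dropout_predict(model, x, n_samples=25, temperature=1.0):
    """Average softmax outputs over stochastic forward passes.

    Logits are divided by `temperature` before the softmax; T > 1 softens
    the predictive distribution and is the scalar tuned for recalibration.
    """
    model.train()  # keep dropout layers stochastic at test time
    with torch.no_grad():
        probs = torch.stack(
            [F.softmax(model(x) / temperature, dim=-1) for _ in range(n_samples)]
        )
    return probs.mean(dim=0)  # Monte Carlo estimate of the predictive distribution


def normalized_entropy(probs):
    """Predictive entropy scaled to [0, 1], used here as the uncertainty measure."""
    num_classes = probs.shape[-1]
    ent = -(probs * probs.clamp_min(1e-12).log()).sum(dim=-1)
    return ent / torch.log(torch.tensor(float(num_classes)))


def expected_uncertainty_calibration_error(probs, labels, n_bins=15):
    """Population-weighted gap between mean uncertainty and error rate per bin."""
    uncert = normalized_entropy(probs)
    errors = (probs.argmax(dim=-1) != labels).float()
    edges = torch.linspace(0.0, 1.0, n_bins + 1)
    uce = torch.tensor(0.0)
    for lo, hi in zip(edges[:-1], edges[1:]):
        in_bin = (uncert > lo) & (uncert <= hi)
        if in_bin.any():
            gap = (errors[in_bin].mean() - uncert[in_bin].mean()).abs()
            uce += in_bin.float().mean() * gap  # weight by fraction of samples in bin
    return uce
```

In use, the temperature would be fit on a held-out validation set after training, e.g. by minimizing the negative log-likelihood or the UCE of the MC-averaged predictions, so that well-calibrated uncertainty can then drive rejection of uncertain predictions and out-of-distribution detection as the abstract describes.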
