Paper Title

Distance-to-Set Priors and Constrained Bayesian Inference

Paper Authors

Rick Presman, Jason Xu

Paper Abstract

Constrained learning is prevalent in many statistical tasks. Recent work proposes distance-to-set penalties to derive estimators under general constraints that can be specified as sets, but focuses on obtaining point estimates that do not come with corresponding measures of uncertainty. To remedy this, we approach distance-to-set regularization from a Bayesian lens. We consider a class of smooth distance-to-set priors, showing that they yield well-defined posteriors toward quantifying uncertainty for constrained learning problems. We discuss relationships and advantages over prior work on Bayesian constraint relaxation. Moreover, we prove that our approach is optimal in an information-geometric sense for finite penalty parameters $\rho$, and enjoys favorable statistical properties as $\rho \to \infty$. The method is designed to perform effectively within gradient-based MCMC samplers, as illustrated on a suite of simulated and real data applications.
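
To make the idea concrete, below is a minimal illustrative sketch, not the authors' implementation. It assumes a prior of the schematic form $\pi(\theta) \propto \exp\{-\tfrac{\rho}{2}\, d(\theta, C)^2\}$ and, purely for illustration, takes the constraint set $C$ to be the unit Euclidean ball. For a closed convex $C$, the squared Euclidean distance is differentiable with gradient $\theta - P_C(\theta)$, where $P_C$ is the projection onto $C$; this smoothness is what makes such priors convenient for gradient-based samplers. The Langevin step at the end is likewise only a schematic usage example, standing in for samplers such as MALA or HMC.

```python
import numpy as np

# Illustrative sketch (hypothetical constraint set and prior form, not the
# paper's code): a smooth distance-to-set log-prior
#   log pi(theta) = -(rho / 2) * dist(theta, C)^2 + const,
# where C is a closed convex set. For convex C the squared distance is
# differentiable, with gradient rho * (P_C(theta) - theta), where P_C is
# the Euclidean projection onto C.

def project_unit_ball(theta):
    """Euclidean projection onto the unit ball {x : ||x|| <= 1} (illustrative C)."""
    norm = np.linalg.norm(theta)
    return theta if norm <= 1.0 else theta / norm

def log_prior(theta, rho):
    """Smooth distance-to-set log-prior, up to an additive constant."""
    dist = np.linalg.norm(theta - project_unit_ball(theta))
    return -0.5 * rho * dist ** 2

def grad_log_prior(theta, rho):
    """Gradient of the log-prior: rho * (P_C(theta) - theta)."""
    return rho * (project_unit_ball(theta) - theta)

def langevin_step(theta, grad_log_lik, rho, step=1e-2, rng=None):
    """One unadjusted Langevin step combining likelihood and prior gradients.

    grad_log_lik: user-supplied gradient of the log-likelihood at theta.
    """
    rng = rng or np.random.default_rng()
    grad = grad_log_lik(theta) + grad_log_prior(theta, rho)
    return theta + 0.5 * step * grad + np.sqrt(step) * rng.standard_normal(theta.shape)
```

As $\rho$ grows, the prior concentrates mass near $C$, which is the sense in which the abstract's $\rho \to \infty$ regime recovers the constrained problem; for finite $\rho$, the sampler explores a relaxed posterior while the projection-based gradient keeps iterates attracted to the constraint set.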
