Paper Title

How good is your Laplace approximation of the Bayesian posterior? Finite-sample computable error bounds for a variety of useful divergences

Paper Authors

Kasprzak, Mikołaj J., Giordano, Ryan, Broderick, Tamara

Paper Abstract

The Laplace approximation is a popular method for constructing a Gaussian approximation to the Bayesian posterior and thereby approximating the posterior mean and variance. But approximation quality is a concern. One might consider using rate-of-convergence bounds from certain versions of the Bayesian Central Limit Theorem (BCLT) to provide quality guarantees. But existing bounds require assumptions that are unrealistic even for relatively simple real-life Bayesian analyses; more specifically, existing bounds either (1) require knowing the true data-generating parameter, (2) are asymptotic in the number of samples, (3) do not control the Bayesian posterior mean, or (4) require strongly log concave models to compute. In this work, we provide the first computable bounds on quality that simultaneously (1) do not require knowing the true parameter, (2) apply to finite samples, (3) control posterior means and variances, and (4) apply generally to models that satisfy the conditions of the asymptotic BCLT. Moreover, we substantially improve the dimension dependence of existing bounds; in fact, we achieve the lowest-order dimension dependence possible in the general case. We compute exact constants in our bounds for a variety of standard models, including logistic regression, and numerically demonstrate their utility. We provide a framework for analysis of more complex models.
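As a concrete illustration of the object the abstract studies, the sketch below builds a Laplace approximation for Bayesian logistic regression with a Gaussian prior, using the standard MAP-plus-Hessian construction. It is a minimal sketch only: the function names (`neg_log_posterior`, `laplace_approximation`) and the `prior_var` setting are illustrative assumptions, not code or constants from the paper.

```python
import numpy as np
from scipy.optimize import minimize

# Minimal sketch: Laplace approximation for Bayesian logistic regression
# with labels y in {-1, +1} and a N(0, prior_var * I) prior.

def neg_log_posterior(theta, X, y, prior_var=10.0):
    """Negative unnormalized log posterior: logistic loss plus Gaussian prior."""
    logits = X @ theta
    nll = np.sum(np.logaddexp(0.0, -y * logits))  # stable log(1 + exp(-y * logits))
    return nll + 0.5 * theta @ theta / prior_var

def neg_log_posterior_hessian(theta, X, y, prior_var=10.0):
    """Hessian of the negative log posterior at theta."""
    p = 1.0 / (1.0 + np.exp(-(X @ theta)))
    W = p * (1.0 - p)                              # per-observation weights
    return (X.T * W) @ X + np.eye(len(theta)) / prior_var

def laplace_approximation(X, y, prior_var=10.0):
    """Return (mean, covariance) of the Gaussian (Laplace) approximation."""
    d = X.shape[1]
    res = minimize(neg_log_posterior, np.zeros(d), args=(X, y, prior_var))
    theta_map = res.x                              # posterior mode (MAP estimate)
    H = neg_log_posterior_hessian(theta_map, X, y, prior_var)
    return theta_map, np.linalg.inv(H)             # approximate posterior mean and covariance

# Usage on synthetic data
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
y = np.sign(X @ np.array([1.0, -2.0, 0.5]) + rng.normal(size=200))
mean, cov = laplace_approximation(X, y)
```

The returned `mean` and `cov` are exactly the approximate posterior mean and variance whose error the paper's bounds are designed to control.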
