Paper Title


Posterior Convergence of Nonparametric Binary and Poisson Regression Under Possible Misspecifications

Paper Authors

Debashis Chatterjee, Sourabh Bhattacharya

Paper Abstract


In this article, we investigate posterior convergence of nonparametric binary and Poisson regression under possible model misspecification, assuming a general stochastic process prior with appropriate properties. Our model setup and objective for binary regression are similar to those of Ghosal and Roy (2006), where the authors used the approach of entropy bounds and exponentially consistent tests with the sieve method to achieve consistency with respect to their Gaussian process prior. In contrast, for both binary and Poisson regression, using a general stochastic process prior, our approach involves verification of the asymptotic equipartition property along with the sieve method, an adaptation of the general results of Shalizi (2009) that is useful even for misspecified models. Moreover, we establish not only posterior consistency but also the rate at which the posterior probabilities converge, which turns out to be the Kullback-Leibler divergence rate. We also investigate the traditional posterior convergence rates. Interestingly, from the subjective Bayesian viewpoint, we show that the posterior predictive distribution can accurately approximate the best possible predictive distribution, in the sense that the Hellinger distance, as well as the total variation distance, between the two distributions can tend to zero, in spite of misspecifications.
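For reference, the key quantities named in the abstract can be written out as follows; the notation below is a standard rendering (following the usual conventions and Shalizi (2009) for the divergence rate), not the paper's own display. The Kullback-Leibler divergence rate between the true data-generating density $f_{\theta_0}$ and a model density $f_{\theta}$ is

$$ h(\theta) \;=\; \lim_{n\to\infty} \frac{1}{n}\, E\!\left[\log \frac{f_{\theta_0}(Y_{1:n})}{f_{\theta}(Y_{1:n})}\right], $$

and the asymptotic equipartition property referred to above requires that $\frac{1}{n}\log\frac{f_{\theta_0}(Y_{1:n})}{f_{\theta}(Y_{1:n})} \to h(\theta)$ almost surely. The two distances used for the predictive approximation result are the Hellinger and total variation distances between densities $p$ and $q$ with respect to a dominating measure $\mu$:

$$ d_H(p,q) \;=\; \left(\tfrac{1}{2}\int \big(\sqrt{p}-\sqrt{q}\,\big)^2 \, d\mu\right)^{1/2}, \qquad d_{TV}(p,q) \;=\; \tfrac{1}{2}\int |p-q| \, d\mu. $$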
