Paper Title
Heavy-Tailed NGG Mixture Models
Paper Authors
Abstract
Heavy tails are often found in practice, and yet they are an Achilles heel of a variety of mainstream random probability measures such as the Dirichlet process (DP). The first contribution of this paper focuses on characterizing the tails of the so-called normalized generalized gamma (NGG) process. We show that the right tail of an NGG process is heavy-tailed provided that the centering distribution is itself heavy-tailed; the DP is the only member of the NGG class that fails to obey this convenient property. The second contribution of the paper rests on the development of two classes of heavy-tailed mixture models and the assessment of their relative merits. Multivariate extensions of the proposed heavy-tailed mixtures are devised here, along with a predictor-dependent version, to learn about the effect of covariates on a multivariate heavy-tailed response. A simulation study suggests that the proposed methods perform well in various scenarios, and we showcase their application on a neuroscience dataset.
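As background for the DP mentioned in the abstract, the sketch below draws an (approximate) sample from a Dirichlet process via the standard truncated stick-breaking construction, using a heavy-tailed Cauchy centering distribution for illustration. The concentration parameter `alpha = 2.0` and the truncation level are arbitrary choices for this sketch, not values from the paper; the paper's NGG process itself requires a different, more involved construction that is not attempted here.

```python
import numpy as np

def dp_stick_breaking(alpha, base_sampler, n_atoms, rng):
    """Truncated stick-breaking draw from a Dirichlet process.

    Returns the atom locations and weights of an (approximate)
    discrete random probability measure G = sum_k w_k * delta_{theta_k}.
    """
    # Stick-breaking proportions: V_k ~ Beta(1, alpha).
    betas = rng.beta(1.0, alpha, size=n_atoms)
    # Remaining stick length before each break: prod_{j<k} (1 - V_j).
    remaining = np.concatenate(([1.0], np.cumprod(1.0 - betas)[:-1]))
    weights = betas * remaining
    # Atom locations drawn i.i.d. from the centering distribution.
    atoms = base_sampler(n_atoms, rng)
    return atoms, weights

rng = np.random.default_rng(0)
# Heavy-tailed (standard Cauchy) centering distribution, echoing the
# heavy-tailed setting the abstract studies.
atoms, weights = dp_stick_breaking(
    alpha=2.0,
    base_sampler=lambda n, r: r.standard_cauchy(n),
    n_atoms=1000,
    rng=rng,
)
print(weights.sum())  # close to 1 at this truncation depth
```

Note that whether such a draw inherits the heavy tail of its centering distribution is exactly the subtle theoretical question the paper addresses; the code only illustrates the construction of the random measure, not the tail result.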