Paper Title
SoftBart: Soft Bayesian Additive Regression Trees
Paper Authors
Paper Abstract
Bayesian additive regression tree (BART) models have received increasing attention in recent years as a general-purpose nonparametric modeling technique. BART combines the flexibility of modern machine learning methods with the principled uncertainty quantification of Bayesian inference, and it has been shown to be uniquely appropriate for the high-noise problems that commonly occur in many areas of science, including medicine and the social sciences. This paper introduces the SoftBart package for fitting the Soft BART algorithm of Linero and Yang (2018). Beyond improving on the predictive performance of other BART packages, a major goal of this package is to facilitate the inclusion of BART in larger models, making it ideal for researchers in Bayesian statistics. I show how to use the package for standard prediction tasks and how to embed BART models in larger models, illustrating with SoftBart implementations of a nonparametric probit regression model, a semiparametric varying coefficient model, and a partial linear model.
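For concreteness, the display below sketches one standard formulation of each model named in the abstract. The notation is conventional rather than taken from the paper itself, so the exact specifications and priors used there may differ: BART places a sum-of-trees prior on an unknown regression function f, and the three illustrations embed such a prior in familiar semiparametric settings.

% BART sum-of-trees representation of an unknown regression function f
\[
  Y_i = f(x_i) + \epsilon_i, \qquad
  f(x) = \sum_{t=1}^{T} g(x;\, \mathcal{T}_t, \mathcal{M}_t), \qquad
  \epsilon_i \sim \mathrm{N}(0, \sigma^2).
\]
% Nonparametric probit regression: a (Soft) BART prior on the latent mean
\[
  \Pr(Y_i = 1 \mid x_i) = \Phi\{ f(x_i) \}.
\]
% Semiparametric varying coefficient model: the effect of z_i varies with x_i
\[
  Y_i = \alpha(x_i) + \beta(x_i)\, z_i + \epsilon_i.
\]
% Partial linear model: linear in z_i, nonparametric in x_i
\[
  Y_i = z_i^\top \gamma + f(x_i) + \epsilon_i.
\]

Here \mathcal{T}_t denotes the structure of the t-th tree and \mathcal{M}_t its leaf parameters. In Soft BART the hard decision rules of each tree are replaced by smooth gating functions, so each g(x; \mathcal{T}_t, \mathcal{M}_t) is a smooth function of x rather than a step function.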