Title
Quasi Black-Box Variational Inference with Natural Gradients for Bayesian Learning
Authors
Abstract
We develop an optimization algorithm suitable for Bayesian learning in complex models. Our approach relies on natural-gradient updates within a general black-box framework, enabling efficient training with limited model-specific derivations. It applies to the class of exponential-family variational posterior distributions; we discuss the Gaussian case extensively, for which the updates take a rather simple form. Our Quasi Black-box Variational Inference (QBVI) framework is readily applicable to a wide class of Bayesian inference problems and is simple to implement, since the updates of the variational posterior involve neither gradients with respect to the model parameters nor the prescription of the Fisher information matrix. We develop QBVI under different hypotheses for the posterior covariance matrix, discuss details of its robust and feasible implementation, and provide a number of real-world applications to demonstrate its effectiveness.
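The "black-box" property the abstract refers to can be illustrated with a standard, well-known tool. A score-function (REINFORCE-style) gradient estimator of the ELBO requires only evaluations of the log joint density, not its gradients with respect to the model parameters. The sketch below is not the paper's QBVI algorithm: it is a minimal score-function black-box VI routine for a Gaussian variational posterior on a toy conjugate model (all names, settings, and the toy model are illustrative assumptions), chosen so the exact posterior is known.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy conjugate model so the exact posterior is known:
# prior theta ~ N(0, 1), likelihood y ~ N(theta, 1), one observation y = 2.
# Exact posterior: N(1.0, 0.5).
y = 2.0

def log_joint(theta):
    # log p(y, theta) up to additive constants; only evaluations are needed,
    # never gradients of the model -- the "black-box" property.
    return -0.5 * theta**2 - 0.5 * (y - theta) ** 2

def elbo_grad(m, log_s, n_samples=500):
    """Score-function gradient of the ELBO w.r.t. (m, log_s)
    for the variational posterior q(theta) = N(m, exp(log_s)^2)."""
    s = np.exp(log_s)
    theta = m + s * rng.standard_normal(n_samples)
    log_q = -0.5 * ((theta - m) / s) ** 2 - np.log(s) - 0.5 * np.log(2 * np.pi)
    f = log_joint(theta) - log_q          # integrand of the ELBO
    f = f - f.mean()                      # simple baseline as control variate
    # score functions: d log q / dm and d log q / d log_s
    score_m = (theta - m) / s**2
    score_ls = ((theta - m) / s) ** 2 - 1.0
    return (score_m * f).mean(), (score_ls * f).mean()

# Plain stochastic gradient ascent on the ELBO.
m, log_s = -1.0, 0.0
lr = 0.05
for _ in range(3000):
    g_m, g_ls = elbo_grad(m, log_s)
    m += lr * g_m
    log_s += lr * g_ls

print(m, np.exp(log_s) ** 2)  # should land near the exact posterior (1.0, 0.5)
```

The paper's natural-gradient ingredient would replace the plain gradient step above: for exponential-family posteriors, the natural gradient of the ELBO coincides with the gradient taken with respect to the expectation parameters, which is why the Fisher information matrix never needs to be formed explicitly.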