Paper title
Gradient-based data and parameter dimension reduction for Bayesian models: an information theoretic perspective
Paper authors
Paper abstract
We consider the problem of reducing the dimensions of parameters and data in non-Gaussian Bayesian inference problems. Our goal is to identify an "informed" subspace of the parameters and an "informative" subspace of the data so that a high-dimensional inference problem can be approximately reformulated in low-to-moderate dimensions, thereby improving the computational efficiency of many inference techniques. To do so, we exploit gradient evaluations of the log-likelihood function. Furthermore, we use an information-theoretic analysis to derive a bound on the posterior error due to parameter and data dimension reduction. This bound relies on logarithmic Sobolev inequalities, and it reveals the appropriate dimensions of the reduced variables. We compare our method with classical dimension reduction techniques, such as principal component analysis and canonical correlation analysis, on applications ranging from mechanics to image processing.
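The abstract's core idea of identifying an "informed" parameter subspace from gradient evaluations of the log-likelihood can be sketched as follows. This is a minimal illustration, not the paper's actual method: it assumes the common active-subspace-style construction in which a diagnostic matrix is formed as the Monte Carlo average of outer products of log-likelihood gradients (evaluated at prior samples), and its leading eigenvectors define the reduced subspace. The function name `informed_subspace` and its interface are hypothetical.

```python
import numpy as np

def informed_subspace(grad_loglik_samples, r):
    """Estimate an 'informed' parameter subspace from gradient samples.

    Hypothetical sketch: given N gradients g_i of the log-likelihood
    (shape (N, d)), form the diagnostic matrix
        H = (1/N) * sum_i g_i g_i^T,
    and return its eigenvalue spectrum (descending) together with the
    r leading eigenvectors, which span the reduced subspace.
    """
    G = np.asarray(grad_loglik_samples)            # shape (N, d)
    H = G.T @ G / G.shape[0]                       # Monte Carlo estimate of E[g g^T]
    eigvals, eigvecs = np.linalg.eigh(H)           # eigh returns ascending order
    order = np.argsort(eigvals)[::-1]              # reorder to descending
    return eigvals[order], eigvecs[:, order[:r]]   # spectrum and rank-r basis
```

In this kind of construction, the decay of the eigenvalue spectrum indicates how many directions of the parameter space the data actually inform, which is the role played by the log-Sobolev-based error bound in the paper: it turns the trailing eigenvalues into a bound on the posterior approximation error, revealing an appropriate reduced dimension r.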