Paper Title

Variational Nearest Neighbor Gaussian Process

Paper Authors

Luhuan Wu, Geoff Pleiss, John Cunningham

Paper Abstract

Variational approximations to Gaussian processes (GPs) typically use a small set of inducing points to form a low-rank approximation to the covariance matrix. In this work, we instead exploit a sparse approximation of the precision matrix. We propose variational nearest neighbor Gaussian process (VNNGP), which introduces a prior that only retains correlations within $K$ nearest-neighboring observations, thereby inducing sparse precision structure. Using the variational framework, VNNGP's objective can be factorized over both observations and inducing points, enabling stochastic optimization with a time complexity of $O(K^3)$. Hence, we can arbitrarily scale the inducing point size, even to the point of putting inducing points at every observed location. We compare VNNGP to other scalable GPs through various experiments, and demonstrate that VNNGP (1) can dramatically outperform low-rank methods, and (2) is less prone to overfitting than other nearest neighbor methods.
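The $O(K^3)$ per-term cost in the abstract comes from the structure of the prior: each point is conditioned only on its $K$ nearest neighbors, so every factor of the objective requires solving a single $K \times K$ kernel system. The sketch below illustrates this nearest-neighbor (Vecchia-style) prior factorization in NumPy. It is a minimal illustration of the idea, not the paper's implementation: the RBF kernel, the function names, and the brute-force sort-based neighbor search are all our own assumptions.

```python
import numpy as np

def rbf_kernel(X1, X2, lengthscale=1.0, variance=1.0):
    """Squared-exponential kernel matrix between two point sets."""
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def nn_prior_log_density(X, f, K=8, jitter=1e-6):
    """Log density of f under a nearest-neighbor GP prior,
    p~(f) = prod_i p(f_i | f_{ne(i)}), where ne(i) contains the K
    nearest points among x_1, ..., x_{i-1}. Each factor needs only
    one K x K solve, hence O(K^3) per term."""
    n = X.shape[0]
    logp = 0.0
    for i in range(n):
        # K nearest neighbors among the previously ordered points.
        nbrs = np.argsort(((X[:i] - X[i]) ** 2).sum(-1))[:K]
        k_ii = rbf_kernel(X[i:i + 1], X[i:i + 1])[0, 0] + jitter
        if len(nbrs) == 0:
            mean, var = 0.0, k_ii  # first point: marginal prior
        else:
            K_nn = rbf_kernel(X[nbrs], X[nbrs]) + jitter * np.eye(len(nbrs))
            k_in = rbf_kernel(X[i:i + 1], X[nbrs])[0]
            w = np.linalg.solve(K_nn, k_in)  # the O(K^3) step
            mean = w @ f[nbrs]               # conditional mean
            var = k_ii - w @ k_in            # conditional variance
        logp += -0.5 * (np.log(2 * np.pi * var) + (f[i] - mean) ** 2 / var)
    return logp

# Usage: evaluate the sparse-precision prior at a random function draw.
rng = np.random.default_rng(0)
X = rng.uniform(0, 5, size=(200, 1))
f = rng.standard_normal(200)
print(nn_prior_log_density(X, f, K=8))
```

Because the factors are independent given the neighbor sets, the sum over $i$ can be subsampled, which is what enables the stochastic optimization the abstract describes. For the full variational method, the authors' approach is also implemented in GPyTorch; see the VNNGP tutorial in its documentation.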
