Title

Riemannian accelerated gradient methods via extrapolation

Authors

Andi Han, Bamdev Mishra, Pratik Jawanpuria, Junbin Gao

Abstract

In this paper, we propose a simple acceleration scheme for Riemannian gradient methods by extrapolating iterates on manifolds. We show that when the iterates are generated by the Riemannian gradient descent method, the accelerated scheme asymptotically achieves the optimal convergence rate and is computationally more favorable than the recently proposed Riemannian Nesterov accelerated gradient methods. Our experiments verify the practical benefit of the novel acceleration strategy.
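To make the extrapolation idea concrete, here is a minimal, self-contained sketch on the unit sphere. It is not the paper's algorithm: it runs plain Riemannian gradient descent and then extrapolates the last two iterates along the geodesic connecting them, using the sphere's exponential and logarithm maps. The step size, the overshoot factor `t`, the two-point extrapolation rule, and the toy objective are all illustrative assumptions; the paper's scheme combines iterates differently and carries the guarantees stated in the abstract.

```python
import numpy as np

# Exponential map on the unit sphere: move from x along tangent vector v.
def sphere_exp(x, v):
    nv = np.linalg.norm(v)
    if nv < 1e-12:
        return x
    return np.cos(nv) * x + np.sin(nv) * (v / nv)

# Logarithm map on the unit sphere: tangent vector at x pointing toward y.
def sphere_log(x, y):
    p = y - np.dot(x, y) * x                      # tangential component of y at x
    norm_p = np.linalg.norm(p)
    if norm_p < 1e-12:
        return np.zeros_like(x)
    theta = np.arccos(np.clip(np.dot(x, y), -1.0, 1.0))
    return theta * p / norm_p

def rgd(egrad_f, x0, step=0.2, n_iters=50):
    """Plain Riemannian gradient descent on the sphere (base iterates)."""
    xs = [x0]
    x = x0
    for _ in range(n_iters):
        g = egrad_f(x)
        rgrad = g - np.dot(x, g) * x              # project Euclidean gradient onto tangent space
        x = sphere_exp(x, -step * rgrad)
        xs.append(x)
    return xs

def geodesic_extrapolate(x_prev, x_last, t=2.0):
    """Overshoot the last step by a factor t along the geodesic from x_prev to x_last
    (illustrative two-point extrapolation; not the paper's exact construction)."""
    return sphere_exp(x_prev, t * sphere_log(x_prev, x_last))

# Toy usage: minimize f(x) = -<a, x> on the sphere; the minimizer is a / ||a||.
a = np.array([1.0, 2.0, 2.0])
egrad_f = lambda x: -a
x0 = np.array([1.0, 0.0, 0.0])
xs = rgd(egrad_f, x0)
x_acc = geodesic_extrapolate(xs[-2], xs[-1])
print("distance to optimum:", np.linalg.norm(x_acc - a / np.linalg.norm(a)))
```

The exponential and logarithm maps are manifold-specific; for other manifolds (Stiefel, SPD, hyperbolic space) they would be replaced by the corresponding closed forms or retractions/inverse retractions.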
